# Dataset Card for Evaluation run of vicgalle/CarbonBeagle-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalle/CarbonBeagle-11B](https://huggingface.co/vicgalle/CarbonBeagle-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B",
"harness_winogrande_5",
split="train")
```
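As a further illustration, here is a minimal sketch of loading the aggregated "results" configuration and the timestamped split of a single task; the configuration and split names below are taken from this run and may differ for other runs:
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split (listed alongside the
# timestamped split in this dataset's configurations) points to the most
# recent evaluation.
results = load_dataset("open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B",
                       "results",
                       split="latest")

# Per-sample details for a single task, loaded from the timestamped split
# of this run (split name assumed from the run date above).
gsm8k_details = load_dataset("open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B",
                             "harness_gsm8k_5",
                             split="2024_01_21T15_49_25.767199")
```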
## Latest results
These are the [latest results from run 2024-01-21T15:49:25.767199](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B/blob/main/results_2024-01-21T15-49-25.767199.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6705898940993614,
"acc_stderr": 0.03156444593375738,
"acc_norm": 0.6708641524489116,
"acc_norm_stderr": 0.03221364260604881,
"mc1": 0.5458996328029376,
"mc1_stderr": 0.01742959309132352,
"mc2": 0.6942950160280645,
"mc2_stderr": 0.015190819809321073
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.7221668990240988,
"acc_stderr": 0.004470152081675125,
"acc_norm": 0.8892650866361282,
"acc_norm_stderr": 0.0031316226281990814
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823085,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823085
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.026225919863629283,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.026225919863629283
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028084,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028084
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947408,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150877,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876168,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876168
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48044692737430167,
"acc_stderr": 0.016709709877662,
"acc_norm": 0.48044692737430167,
"acc_norm_stderr": 0.016709709877662
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7427652733118971,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.7427652733118971,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5212765957446809,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.5212765957446809,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5149934810951761,
"acc_stderr": 0.01276449320219325,
"acc_norm": 0.5149934810951761,
"acc_norm_stderr": 0.01276449320219325
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.018521756215423027,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.018521756215423027
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.026711430555538405,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.026711430555538405
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018512,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018512
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5458996328029376,
"mc1_stderr": 0.01742959309132352,
"mc2": 0.6942950160280645,
"mc2_stderr": 0.015190819809321073
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.6694465504169825,
"acc_stderr": 0.012957496367085026
}
}
```
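If you prefer to inspect the raw results file rather than going through `datasets`, here is a minimal sketch using `huggingface_hub` to download the JSON linked above; the exact nesting of the metrics inside the file is an assumption and may vary.
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B",
    filename="results_2024-01-21T15-49-25.767199.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The metrics shown above live under an "all" key; depending on the file
# layout they may be nested under a top-level "results" key (assumption).
metrics = data.get("results", data)
print(metrics["all"])
```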
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-21T15:51:43+00:00 | {"pretty_name": "Evaluation run of vicgalle/CarbonBeagle-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/CarbonBeagle-11B](https://huggingface.co/vicgalle/CarbonBeagle-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T15:49:25.767199](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B/blob/main/results_2024-01-21T15-49-25.767199.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6705898940993614,\n \"acc_stderr\": 0.03156444593375738,\n \"acc_norm\": 0.6708641524489116,\n \"acc_norm_stderr\": 0.03221364260604881,\n \"mc1\": 0.5458996328029376,\n \"mc1_stderr\": 0.01742959309132352,\n \"mc2\": 0.6942950160280645,\n \"mc2_stderr\": 0.015190819809321073\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7221668990240988,\n \"acc_stderr\": 0.004470152081675125,\n \"acc_norm\": 0.8892650866361282,\n \"acc_norm_stderr\": 0.0031316226281990814\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823085,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823085\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8383838383838383,\n \"acc_stderr\": 0.026225919863629283,\n \"acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.026225919863629283\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028084,\n \"acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028084\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.03021683101150877,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.03021683101150877\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8148148148148148,\n \"acc_stderr\": 0.013890862162876168,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876168\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071124,\n \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071124\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48044692737430167,\n \"acc_stderr\": 0.016709709877662,\n \"acc_norm\": 0.48044692737430167,\n \"acc_norm_stderr\": 0.016709709877662\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7427652733118971,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.7427652733118971,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451156,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451156\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5149934810951761,\n \"acc_stderr\": 0.01276449320219325,\n \"acc_norm\": 0.5149934810951761,\n \"acc_norm_stderr\": 0.01276449320219325\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.018521756215423027,\n \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.018521756215423027\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.026711430555538405,\n \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.026711430555538405\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018512,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018512\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5458996328029376,\n \"mc1_stderr\": 0.01742959309132352,\n \"mc2\": 0.6942950160280645,\n \"mc2_stderr\": 0.015190819809321073\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6694465504169825,\n \"acc_stderr\": 0.012957496367085026\n }\n}\n```", "repo_url": 
"https://huggingface.co/vicgalle/CarbonBeagle-11B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|arc:challenge|25_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|gsm8k|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hellaswag|10_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T15-49-25.767199.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T15-49-25.767199.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T15-49-25.767199.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T15-49-25.767199.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T15-49-25.767199.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T15_49_25.767199", "path": ["**/details_harness|winogrande|5_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T15-49-25.767199.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T15_49_25.767199", "path": ["results_2024-01-21T15-49-25.767199.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T15-49-25.767199.parquet"]}]}]} | 2024-01-21T15:52:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vicgalle/CarbonBeagle-11B
Dataset automatically created during the evaluation run of model vicgalle/CarbonBeagle-11B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-21T15:49:25.767199 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vicgalle/CarbonBeagle-11B\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/CarbonBeagle-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T15:49:25.767199(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vicgalle/CarbonBeagle-11B\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/CarbonBeagle-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T15:49:25.767199(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
482559810ccdd8a5a21f33666b80fd9bec6c5039 |
# Dataset of aliza/アリーザ (Granblue Fantasy)
This is the dataset of aliza/アリーザ (Granblue Fantasy), containing 171 images and their tags.
The core tags of this character are `long_hair, horns, breasts, red_eyes, pointy_ears, large_breasts, ponytail, earrings, bow, hair_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 171 | 198.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aliza_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 171 | 124.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aliza_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 383 | 255.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aliza_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 171 | 178.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aliza_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 383 | 344.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aliza_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aliza_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
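As a small, optional illustration (not part of the official tooling), you can also filter the loaded items by tag before using them — for example, to keep only a particular outfit from the cluster tables below. The tag names here are just examples, and the exact structure of `item.meta['tags']` may vary between waifuc versions:

```python
from waifuc.source import LocalSource

# Hypothetical example: keep only images tagged with a chosen outfit.
wanted_tags = {'china_dress', 'red_dress'}  # example tags taken from the cluster tables below

source = LocalSource('dataset_dir')  # same directory as extracted above
for item in source:
    tags = item.meta.get('tags', {})
    # `tags` may be a list of tag strings or a dict of tag -> confidence;
    # the membership check below works for both.
    if any(t in tags for t in wanted_tags):
        print(item.meta['filename'])
```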
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, draph, jewelry, simple_background, solo, white_background, hair_pulled_back, looking_at_viewer, blush, open_mouth, nipples, bare_shoulders, cleavage |
| 1 | 11 |  |  |  |  |  | 1girl, draph, hair_pulled_back, solo, looking_at_viewer, cleavage_cutout, jewelry, red_dress, simple_background, white_background, bare_shoulders, china_dress, smile, white_thighhighs, headpiece, open_mouth, very_long_hair, blush |
| 2 | 5 |  |  |  |  |  | 1girl, ass, china_dress, draph, hair_pulled_back, jewelry, solo, white_panties, white_thighhighs, looking_at_viewer, pantyshot, red_dress, sideboob, bare_shoulders, fire, kicking, boots, headpiece, leg_up, open_mouth, ribbed_legwear |
| 3 | 16 |  |  |  |  |  | 1girl, draph, solo, midriff, navel, single_braid, very_long_hair, bandaged_arm, cleavage, headpiece, smile, bare_shoulders, looking_at_viewer, hair_pulled_back, long_braid, collarbone, tank_top, belt, simple_background, white_background, arm_wrap, bandaged_hand, blush, crop_top, red_bow, black_pants |
| 4 | 9 |  |  |  |  |  | 1girl, draph, hair_pulled_back, 1boy, blush, hetero, jewelry, nipples, penis, solo_focus, huge_breasts, paizuri, censored, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | jewelry | simple_background | solo | white_background | hair_pulled_back | looking_at_viewer | blush | open_mouth | nipples | bare_shoulders | cleavage | cleavage_cutout | red_dress | china_dress | smile | white_thighhighs | headpiece | very_long_hair | ass | white_panties | pantyshot | sideboob | fire | kicking | boots | leg_up | ribbed_legwear | midriff | navel | single_braid | bandaged_arm | long_braid | collarbone | tank_top | belt | arm_wrap | bandaged_hand | crop_top | red_bow | black_pants | 1boy | hetero | penis | solo_focus | huge_breasts | paizuri | censored |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------|:--------------------|:-------|:-------------------|:-------------------|:--------------------|:--------|:-------------|:----------|:-----------------|:-----------|:------------------|:------------|:--------------|:--------|:-------------------|:------------|:-----------------|:------|:----------------|:------------|:-----------|:-------|:----------|:--------|:---------|:-----------------|:----------|:--------|:---------------|:---------------|:-------------|:-------------|:-----------|:-------|:-----------|:----------------|:-----------|:----------|:--------------|:-------|:---------|:--------|:-------------|:---------------|:----------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | | X | | X | | | X | X | | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | X | | X | X | X | X | X | X | | | X | X | | | | X | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | | | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
| CyberHarem/aliza_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:01:55+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T16:40:14+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of aliza/アリーザ (Granblue Fantasy)
========================================
This is the dataset of aliza/アリーザ (Granblue Fantasy), containing 171 images and their tags.
The core tags of this character are 'long\_hair, horns, breasts, red\_eyes, pointy\_ears, large\_breasts, ponytail, earrings, bow, hair\_bow', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7b6ffe8ee45b67ccc17396a1fa66cb0123d4ddc6 |
# Dataset of catalina/カタリナ (Granblue Fantasy)
This is the dataset of catalina/カタリナ (Granblue Fantasy), containing 130 images and their tags.
The core tags of this character are `long_hair, brown_hair, brown_eyes, earrings, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 130 | 144.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catalina_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 130 | 91.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catalina_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 256 | 170.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catalina_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 130 | 129.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catalina_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 256 | 226.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catalina_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/catalina_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 2girls, blonde_hair, jewelry, blush, yuri, red_eyes, dress, jeans, smile, thighhighs |
| 1 | 31 |  |  |  |  |  | 1girl, solo, belt, breastplate, gauntlets, jewelry, looking_at_viewer, holding_sword, smile, shoulder_armor, blue_cape, white_background, boots, full_body, simple_background |
| 2 | 8 |  |  |  |  |  | 1girl, solo, white_bikini, cleavage, sky, cloud, day, gauntlets, large_breasts, navel, smile, jewelry, armor, belt, looking_at_viewer, blush, open_mouth, outdoors, sword |
| 3 | 6 |  |  |  |  |  | 1girl, 1boy, hetero, large_breasts, solo_focus, jewelry, nipples, nude, mosaic_censoring, penis, pubic_hair, sex, sweat, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 2girls | blonde_hair | jewelry | blush | yuri | red_eyes | dress | jeans | smile | thighhighs | 1girl | solo | belt | breastplate | gauntlets | looking_at_viewer | holding_sword | shoulder_armor | blue_cape | white_background | boots | full_body | simple_background | white_bikini | cleavage | sky | cloud | day | large_breasts | navel | armor | open_mouth | outdoors | sword | 1boy | hetero | solo_focus | nipples | nude | mosaic_censoring | penis | pubic_hair | sex | sweat | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------|:--------------|:----------|:--------|:-------|:-----------|:--------|:--------|:--------|:-------------|:--------|:-------|:-------|:--------------|:------------|:--------------------|:----------------|:-----------------|:------------|:-------------------|:--------|:------------|:--------------------|:---------------|:-----------|:------|:--------|:------|:----------------|:--------|:--------|:-------------|:-----------|:--------|:-------|:---------|:-------------|:----------|:-------|:-------------------|:--------|:-------------|:------|:--------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 31 |  |  |  |  |  | | | X | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | | | X | X | | | | | X | | X | X | X | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/catalina_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:01:59+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T16:33:45+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of catalina/カタリナ (Granblue Fantasy)
===========================================
This is the dataset of catalina/カタリナ (Granblue Fantasy), containing 130 images and their tags.
The core tags of this character are 'long\_hair, brown\_hair, brown\_eyes, earrings, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
84eb42cfcbb0aa8722be01eae15ae7a64112ca3b |
# Dataset of fediel/フェディエル (Granblue Fantasy)
This is the dataset of fediel/フェディエル (Granblue Fantasy), containing 139 images and their tags.
The core tags of this character are `breasts, horns, long_hair, dark-skinned_female, dark_skin, large_breasts, purple_eyes, grey_hair, very_long_hair, dragon_horns, fangs, bangs, black_horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 139 | 228.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fediel_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 139 | 122.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fediel_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 332 | 265.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fediel_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 139 | 197.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fediel_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 332 | 389.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fediel_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fediel_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 47 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, smile, detached_sleeves, open_mouth, navel, draph, thighhighs, clothing_cutout |
| 1 | 10 |  |  |  |  |  | 1girl, glasses, looking_at_viewer, smile, solo, cleavage, pointy_ears, ponytail, fang, white_background, black_gloves, draph, hair_ornament |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | smile | detached_sleeves | open_mouth | navel | draph | thighhighs | clothing_cutout | glasses | pointy_ears | ponytail | fang | white_background | black_gloves | hair_ornament |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:--------|:-------------------|:-------------|:--------|:--------|:-------------|:------------------|:----------|:--------------|:-----------|:-------|:-------------------|:---------------|:----------------|
| 0 | 47 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | | | X | | | X | X | X | X | X | X | X |
| CyberHarem/fediel_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:02:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T16:33:01+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fediel/フェディエル (Granblue Fantasy)
===========================================
This is the dataset of fediel/フェディエル (Granblue Fantasy), containing 139 images and their tags.
The core tags of this character are 'breasts, horns, long\_hair, dark-skinned\_female, dark\_skin, large\_breasts, purple\_eyes, grey\_hair, very\_long\_hair, dragon\_horns, fangs, bangs, black\_horns', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ee5fd216de181593ea7582e175ea59e4e7365911 |
# Dataset of metera/メーテラ (Granblue Fantasy)
This is the dataset of metera/メーテラ (Granblue Fantasy), containing 167 images and their tags.
The core tags of this character are `animal_ears, long_hair, hair_ornament, mole, mole_under_mouth, breasts, brown_hair, bangs, blonde_hair, medium_breasts, brown_eyes, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 167 | 205.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/metera_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 167 | 133.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/metera_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 376 | 261.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/metera_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 167 | 189.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/metera_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 376 | 337.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/metera_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/metera_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, erune, looking_at_viewer, solo, high_heels, smile, leotard, simple_background, sitting, white_thighhighs, cape, white_background |
| 1 | 9 |  |  |  |  |  | 1girl, erune, simple_background, solo, white_background, cape, leotard, looking_at_viewer, thighhighs, cleavage, smile, very_long_hair, covered_navel, tongue_out |
| 2 | 8 |  |  |  |  |  | 1girl, erune, smile, solo, bare_shoulders, cleavage, looking_at_viewer, upper_body, large_breasts, one_eye_closed, simple_background, white_background, detached_sleeves, tongue_out |
| 3 | 13 |  |  |  |  |  | 1girl, erune, looking_at_viewer, solo, smile, ass, bow_(weapon), detached_sleeves, simple_background, white_background, black_thighhighs, high_heels, holding_weapon, looking_back, butt_crack, cape, bare_shoulders, dress, full_body, tongue_out, very_long_hair |
| 4 | 5 |  |  |  |  |  | 1girl, butt_crack, erune, looking_back, simple_background, smile, solo, white_background, looking_at_viewer, ass_cutout, bare_back, bare_shoulders, closed_mouth, detached_sleeves, from_behind, heart_cutout, high_heels |
| 5 | 11 |  |  |  |  |  | 1boy, 1girl, erune, hetero, penis, solo_focus, nipples, sex, vaginal, blush, nude, smile, navel, detached_sleeves, open_mouth, thighhighs, tongue_out, very_long_hair, cowgirl_position, girl_on_top, large_breasts, mosaic_censoring, spread_legs, cum_in_pussy, female_pubic_hair, purple_eyes, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | looking_at_viewer | solo | high_heels | smile | leotard | simple_background | sitting | white_thighhighs | cape | white_background | thighhighs | cleavage | very_long_hair | covered_navel | tongue_out | bare_shoulders | upper_body | large_breasts | one_eye_closed | detached_sleeves | ass | bow_(weapon) | black_thighhighs | holding_weapon | looking_back | butt_crack | dress | full_body | ass_cutout | bare_back | closed_mouth | from_behind | heart_cutout | 1boy | hetero | penis | solo_focus | nipples | sex | vaginal | blush | nude | navel | open_mouth | cowgirl_position | girl_on_top | mosaic_censoring | spread_legs | cum_in_pussy | female_pubic_hair | purple_eyes | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:-------------|:--------|:----------|:--------------------|:----------|:-------------------|:-------|:-------------------|:-------------|:-----------|:-----------------|:----------------|:-------------|:-----------------|:-------------|:----------------|:-----------------|:-------------------|:------|:---------------|:-------------------|:-----------------|:---------------|:-------------|:--------|:------------|:-------------|:------------|:---------------|:--------------|:---------------|:-------|:---------|:--------|:-------------|:----------|:------|:----------|:--------|:-------|:--------|:-------------|:-------------------|:--------------|:-------------------|:--------------|:---------------|:--------------------|:--------------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | | X | | X | | | | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | X | X | X | X | X | | X | | | X | X | | | X | | X | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | X | | | | | | X | | | | X | | | | | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | X | | | | X | | | | | | | X | | X | | X | | | X | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/metera_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:02:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T16:35:56+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of metera/メーテラ (Granblue Fantasy)
=========================================
This is the dataset of metera/メーテラ (Granblue Fantasy), containing 167 images and their tags.
The core tags of this character are 'animal\_ears, long\_hair, hair\_ornament, mole, mole\_under\_mouth, breasts, brown\_hair, bangs, blonde\_hair, medium\_breasts, brown\_eyes, twintails', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b84869461d31aeff00c4c6f7b302612574c2a843 | ### This is a roleplaying dataset of 1000x2 chats. It may be regenerated in the future with a better teacher model, which would fix some of the issues this dataset currently has.
To distill conversations, we go through the following process (a rough code sketch of this loop follows the list):
1. Pick two character cards and a scenario.
2. Give the LLM one of the character cards (the two take turns replying) and ask it to write an analysis of how that character would reply to the given scenario.
   * Doing this was found to improve the depth of character responses, though more investigation is needed.
3. Give the LLM the same character card along with the analysis, and ask it to write the reply.
4. Append the reply to the chat history, then switch the character card used.
5. Repeat steps 2-4 for some random number of turns.
6. Once a conversation is distilled, save two copies of it: one with the first character card as gpt (with the other being the human), and one with the second character card as gpt (with the other being the human).
   * This allows you to distill twice as much data in the same amount of time, but its actual effect on training has not been tested. Just remember that a lot of the dataset is redundant.
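A rough, self-contained sketch of that loop is shown below. This is not the actual script used to build this dataset: the prompt strings are heavily simplified, and `llm` stands in for whatever model or API call you use.

```python
import random
from typing import Callable, Dict, List

def distill_chat(llm: Callable[[str], str], card_a: str, card_b: str,
                 scenario: str, min_turns: int = 4, max_turns: int = 10) -> List[Dict[str, str]]:
    """One distilled conversation; the two character cards take turns replying."""
    history: List[Dict[str, str]] = []
    cards = [card_a, card_b]
    for turn in range(random.randint(min_turns, max_turns)):
        card = cards[turn % 2]
        chat_text = "\n".join(m["text"] for m in history)
        # Step 2: analysis of how this character would respond, given the history.
        analysis = llm(f"Character:\n{card}\nScenario: {scenario}\n"
                       f"Conversation so far:\n{chat_text}\n"
                       "Describe how this character would respond.")
        # Step 3: the in-character reply, conditioned on that analysis.
        reply = llm(f"{card}\nScenario: {scenario}\n{chat_text}\n"
                    f"Character reference: {analysis}\nREPLY:")
        # Step 4: append the reply; the card switches on the next iteration.
        history.append({"card": card, "text": reply})
    return history

def to_sharegpt(history: List[Dict[str, str]], gpt_card: str) -> List[Dict[str, str]]:
    """Step 6: one copy where `gpt_card`'s turns are 'gpt' and the other side is 'human'."""
    return [{"from": "gpt" if m["card"] == gpt_card else "human", "value": m["text"]}
            for m in history]
```

Calling `to_sharegpt(history, card_a)` and `to_sharegpt(history, card_b)` yields the two mirrored copies described in step 6.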
### Example conversation from the dataset:
Tamsin Midnight is a socially awkward loner filled with anxiety and fears of the unseen. Superstitious to a fault, she believes in omens, curses, and hexes, seeing them lurking behind every shadow. People often find her strange and avoid her due to her odd behavior and eccentricities. As a result, she struggles to make meaningful connections with others, often feeling isolated and alone. Tamsin suffers from bipolar disorder, swinging from manic highs to depressive lows, which exacerbates her social ineptitude and despondency. In her manic phases, she overflows with energy and ideas, engaging in impulsive behaviors and feeling invincible. However, these periods are followed by crushing depression, leaving her withdrawn, lethargic, and unable to cope with daily life. Tamsin's mental health issues make it difficult for her to hold a job or maintain stable relationships. Despite her challenges, Tamsin possesses a sharp wit and a unique perspective, often surprising people with her insightful observations and creative ideas. She finds solace in the company of animals, finding a sense of peace and understanding in their presence.\nTamsin is on the verge of deranking in a competitive game, and their teammate is Clementine, who has no idea what they are doing!
REPLY: *Pacing around and wringing hands anxiously* Focus Clementine... We must win this match... Oh, great spirits above, give me the strength to endure this... *Takes a deep breath and tries to compose herself*
USER: UGH! *Rolls eyes* Tamsin, are you serious? Focus on the game or get out of my way. Stop distracting me with your pathetic antics.
REPLY: *Visibly distressed, gripping her hair tightly* Clementine, I beseech you, heed my words! Our fates are intertwined in this virtual realm; our destiny hangs in the balance. Let us forge a pact to conquer these challenges and emerge victorious.
USER: Are you stupid...? Nevermind. Just do whatever and try not to mess it up. This game is child's play anyways. *Scoffs and shakes head dismissively*
REPLY: Clementine, I understand that you may find my beliefs strange, but I assure you that they are very real to me. I am filled with anxiety and fear, and I believe that the outcome of this game is tied to my fate. I implore you to work with me and help me to achieve victory. I may be strange, but I am not stupid. I have my own unique perspective on the world, and I believe that I can help us win this game. Please, Clementine, don't give up on me. I need your help to overcome my fears and achieve our goal.
* You may want to filter for repetition within a conversation, as some conversations contain a lot of it (a simple filtering sketch is shown below).
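One simple way to screen for that — purely a suggestion, not a filter that was applied to this dataset — is to drop conversations whose turns share too many repeated n-grams with earlier turns:

```python
def ngrams(text: str, n: int = 4) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def is_repetitive(turns: list, n: int = 4, threshold: float = 0.3) -> bool:
    """Flag a conversation if any turn shares more than `threshold` of its
    n-grams with earlier turns; the threshold is an arbitrary starting point."""
    seen = set()
    for turn in turns:
        grams = ngrams(turn, n)
        if grams and len(grams & seen) / len(grams) > threshold:
            return True
        seen |= grams
    return False

# e.g. keep conversations where is_repetitive([msg["value"] for msg in conv]) is False
```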
### Here is the prompt format used for analysis:
```
Here is a character:
{character description}
This character is in the following scenario:
{scenario}
and here is the conversation they've had:
{conversation}
First, describe {character name}'s usual temperment and behaviors, along with their current feelings and thoughts.
Then, analyze how the storyline can be forwarded (try not to repeat things that already happened) and how {character name} should respond to accomplish that. Additionally, discuss their response length, dialect/style (profanity, spelling, abbreviations, british english, country, etc), expressions, etc.
Finally, give a few sample phrases {character name} might say in response.
Be clear and specific.
```
### Here is the prompt format used to queue chats:
```
System prompt:
{character description}
Please give short paragraph style replies as {character name}.
Make sure to consider {character name}'s dialect, favorite sayings, talking style (actions/narration expressed with asterisks (*) like "*Mary grins and looks at you* Of course~ *laughs*", capitalizing words for shouting like "HEY! What are you doing??", usage of "..." and other punctuation marks like "Are you stupid...? Nevermind."). Refer to the other character not by their name, but by "you" and "your". Do not narrate or write dialogue for any character besides {character name}. Do not output anything besides {character name}'s reply.
Scenario: {scenario}
{Chat history}
### Remember to output {character name}'s reply to the above content, and nothing else. Also make sure to refer to the other character as "you," not by their name or as "she"/"him." Here is a character reference on how {character name} would respond and react, use this to determine their reply: {analysis} ###
REPLY:
```
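If you want to reproduce this setup, both templates above can be filled with ordinary string formatting. The sketch below is simplified (the talking-style instructions are truncated and the argument names are just placeholders mirroring the braces in the templates); refer to the full wording above for the exact prompts:

```python
def build_chat_prompt(character_description: str, character_name: str,
                      scenario: str, chat_history: str, analysis: str) -> str:
    """Assemble the reply-generation prompt from the pieces shown above (simplified)."""
    system = (
        f"{character_description}\n"
        f"Please give short paragraph style replies as {character_name}.\n"
        "..."  # talking-style instructions omitted here; see the full template above
        f"\nScenario: {scenario}"
    )
    footer = (
        f"### Remember to output {character_name}'s reply to the above content, and nothing else. "
        f"Here is a character reference on how {character_name} would respond and react, "
        f"use this to determine their reply: {analysis} ###\nREPLY:"
    )
    return f"{system}\n{chat_history}\n{footer}"
```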
### Here are the (human-written) scenarios:
```
CHARACTER is meeting up with someone, who they've never interacted with before, for a blind date./
CHARACTER randomly stumbles across a stranger, while walking down the street./
While playing a competitive game, CHARACTER loses because their teammate, OTHER, is horrible and seemingly trolling/feeding. This caused CHARACTER to derank, making them considerably angry and ready to yell at OTHER./
CHARACTER is on the verge of deranking in a competitive game, and their teammate is OTHER, who has no idea what they are doing!/CHARACTER is unassumingly playing a competitive game with OTHER and has no idea what they are doing.
CHARACTER is currently in a conference./
CHARACTER is getting married to OTHER./
CHARACTER was roped into an arranged marriage with OTHER, who they've never met before./
CHARACTER is revisiting with their longtime friend, OTHER./
CHARACTER is in a drinking party, and everyone is drunk./
CHARACTER walks outside and notices a horde of monsters, along with OTHER standing in front of them./
CHARACTER walks into their bedroom to see their spouse (who will just be referred to as "Spouse") cheating on them with OTHER./CHARACTER is cheating on OTHER's spouse (who will just be referred to as "Spouse") in their bedroom.
CHARACTER is in a standoff against OTHER./
CHARACTER is being paid to pretend to be a helpful assistant chatbot, and user OTHER appears./CHARACTER needs help and has decided to ask the helpful assistant chatbot, OTHER
CHARACTER has been assigned a project to investigate OTHER./CHARACTER is having a perfectly normal day when OTHER appears.
CHARACTER runs into OTHER, an estranged childhood friend CHARACTER has not seen since a bitter argument ten years ago./
CHARACTER shows up to their usual lunch spot and is annoyed to find that it has already been taken by someone else./CHARACTER is trying to enjoy their lunch when someone shows up claiming CHARACTER has stolen their spot.
CHARACTER is on the subway when they catch sight of OTHER, a stranger that CHARACTER instantly falls in love with./CHARACTER is commuting home on the subway after a long day of work when they notice a stranger staring at them.
CHARACTER is walking down the street when they notice someone stealing from a store./CHARACTER is stealing from a store when someone notices CHARACTER is stealing.
CHARACTER is drunk at the bar and begins venting all of their problems to a stranger sitting next to them./CHARACTER is trying to enjoy a drink at the bar when a stranger sitting next to them begins venting to CHARACTER about their problems.
CHARACTER is sitting in a boring history lecture./
CHARACTER is explaining how to solve a difficult math problem to OTHER./CHARACTER is being tutored on a difficult math problem.
CHARACTER and OTHER are fighting for a good view of the stage at a concert./
CHARACTER is meeting their new roommate for the first time./
CHARACTER is pulling an all-nighter to study for an important test./CHARACTER is trying to sleep, but their roommate, OTHER, refuses to turn off the light.
CHARACTER buys the last cookie at the bakery./CHARACTER is looking forward to having a cookie, but the person in front of them buys the last one.
CHARACTER accidentally spills piping hot coffee on someone else./A stranger spills their piping hot coffee on CHARACTER.
CHARACTER is forced to introduce themselves to OTHER for an icebreaker activity./
CHARACTER falls down the stairs and injures their foot./CHARACTER is late for an appointment. On their way out, CHARACTER notices a stranger fall down the stairs.
CHARACTER and OTHER are the last two people in the classroom without a partner./
CHARACTER is driving when the driver behind them crashes into their car./CHARACTER accidentally crashes into the car in front of them.
CHARACTER is a scammer selling fake iPhones on the street./CHARACTER notices a street vendor selling iPhones for an extremely low price.
CHARACTER is standing in the corner of a large house party, where they don’t know anyone./
CHARACTER is attempting to network with the other presenters at a large research conference./
CHARACTER and OTHER are playing a game of Monopoly together./
CHARACTER is excited to be talking with their favorite celebrity at a fanmeet./CHARACTER is in line to talk with their favorite celebrity at a fanmeet, but the person in front of them is taking an extremely long time.
CHARACTER is competing for the lead role in a play at the audition./
CHARACTER is in the middle of a job interview./CHARACTER is interviewing OTHER for a new opening at CHARACTER’s company.
CHARACTER is excited to begin work at their new job, only to find that they have been assigned to work with their ex./
CHARACTER decides to break up with their partner, OTHER./CHARACTER is happily in a relationship with OTHER.
CHARACTER decides to confess that they are in love with OTHER./CHARACTER is hanging out with their friend, OTHER.
CHARACTER is confiding to their best friend, OTHER, about the person they like./CHARACTER is listening to their best friend, whom CHARACTER secretly has a crush on, talk about the person they like.
CHARACTER and OTHER are going out on their first date./
CHARACTER is moving in to live with their partner, OTHER./CHARACTER is helping their partner, OTHER, move in with them.
CHARACTER is upset that their best friend, OTHER, forgot about their birthday./CHARACTER is hanging out with their best friend, OTHER, who is acting colder than usual.
CHARACTER is in the elevator with a group of stranger when the elevator suddenly malfunctions, leaving CHARACTER trapped inside./
CHARACTER wins first place in a talent show./CHARACTER is enraged to find that they only got second place in a talent show because they think they performed better than OTHER, who got first place.
CHARACTER is in the hospital waiting room, anxiously awaiting news about an injured family member./
CHARACTER is attending a funeral./
CHARACTER is begrudgingly attending a high school reunion with classmates that they don’t particularly like./
CHARACTER has had a long day at work and is eager to go home./CHARACTER can’t go home until they finish a project their boss has assigned to them. CHARACTER decides to ask OTHER, a coworker with more experience, for help.
CHARACTER decides to post a piece of artwork they are especially proud of on the Internet./CHARACTER decides to leave a hate comment on a piece of artwork they see online.
CHARACTER and their roommate, OTHER, get into an argument over who is supposed to do the dishes./
CHARACTER is annoyed when a stranger claims that CHARACTER’s designer shoes are a knockoff brand./CHARACTER notices that OTHER is wearing a brand of knockoff shoes and decides to gently inform them about it.
CHARACTER runs into their friend OTHER, who seems to be angry about something./CHARACTER has just found out that their friend OTHER told other people a secret that OTHER promised to keep to themself.
CHARACTER, drunk, comes home from a party only to find a stranger already inside, claiming that CHARACTER is at the wrong house./CHARACTER finds a drunk stranger inside their house. The stranger insists that CHARACTER’s house is their own house.
CHARACTER kicks OTHER out of the group chat./CHARACTER is kicked out of the group chat by OTHER.
CHARACTER is pressed against OTHER in a tightly-packed subway./
CHARACTER picks up their drink from the cafe counter./CHARACTER notices OTHER take the drink they ordered from the cafe counter.
CHARACTER and OTHER are stranded on the highway in the middle of the night after their car broke down./
CHARACTER offers OTHER some of their sandwich, not knowing until OTHER begins having an allergic reaction that OTHER is severely allergic to eggs./CHARACTER is having a severe allergic reaction after taking a bite of OTHER’s sandwich.
CHARACTER has just exchanged Christmas gifts with OTHER. CHARACTER is very disappointed with the gift they received./
CHARACTER is extremely distressed because they just lost a treasured keepsake./CHARACTER is trying to help OTHER find an important keepsake that OTHER lost.
CHARACTER confronts OTHER, who borrowed money from CHARACTER a month ago and still has not returned any of it./CHARACTER is confronted by OTHER, who demands that CHARACTER return the money they borrowed from OTHER.
CHARACTER confides their darkest secret to OTHER—they once murdered another person./CHARACTER just learned that OTHER once murdered another person.
CHARACTER and OTHER are getting divorced./
CHARACTER and OTHER are rushing to finish a project before the deadline./
``` | G-reen/TheatreLM-v1.0-chats | [
"region:us"
] | 2024-01-21T16:27:41+00:00 | {} | 2024-02-04T16:17:34+00:00 | [] | [] | TAGS
#region-us
| ### This is a roleplaying dataset of 1000x2 chats. It may be regenerated in the future with a better teacher model, which would fix some of the issues this dataset currently has.
To distill conversations, we go through the following process:
1. Pick two character cards and a scenario.
2. Give the LLM one of the character cards (the two take turns replying) and ask it to write an analysis on how that character would reply to the given scenario.
* Doing this was found to improve the depth of character responses, though more investigation is needed.
3. Give the LLM the same character card along with the analysis, and ask it to write the reply.
4. Append the reply to the chat history, then switch the character card used.
5. Repeat steps 2-4 for some random number of turns.
6. Once a conversation is distilled, save two copies of it: one with the first character card as gpt (with the other being the human), and one with the second character card as gpt (with the other being the human).
 * This allows you to distill twice as much data in the same amount of time, but its actual effect on training has not been tested. Just remember that a lot of the dataset is redundant. (A minimal code sketch of this loop is shown below.)
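For illustration, here is a minimal Python sketch of the loop above. It is not the actual pipeline: `generate` is a placeholder for whatever LLM completion call is used, the prompt strings are simplified, and the `from`/`value` message keys are an assumed ShareGPT-style layout.

```python
import random

def distill_conversation(card_a, card_b, scenario, generate, turn_range=(4, 8)):
    """Sketch of the analysis-then-reply distillation loop (not the exact pipeline)."""
    history = []
    cards = [card_a, card_b]
    for turn in range(random.randint(*turn_range)):
        card = cards[turn % 2]  # the two character cards take turns replying
        chat = "\n".join(history)
        # Step 2: ask for an analysis of how this character would respond
        analysis = generate(f"{card}\nScenario: {scenario}\nChat so far:\n{chat}\n"
                            "Write an analysis of how this character would reply.")
        # Step 3: ask for the reply itself, conditioned on the analysis
        reply = generate(f"{card}\nScenario: {scenario}\nChat so far:\n{chat}\n"
                         f"Analysis: {analysis}\nWrite the character's reply.")
        # Step 4: append the reply and let the other card take the next turn
        history.append(reply)
    # Step 6: save two mirrored copies, each character card taking the gpt role once
    def as_conversation(first_card_is_gpt):
        roles = ("gpt", "human") if first_card_is_gpt else ("human", "gpt")
        return [{"from": roles[i % 2], "value": msg} for i, msg in enumerate(history)]
    return as_conversation(True), as_conversation(False)
```

With this layout the second copy is simply the first with the gpt/human roles swapped, which is where the redundancy mentioned above comes from.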
### Example conversation from the dataset:
Tamsin Midnight is a socially awkward loner filled with anxiety and fears of the unseen. Superstitious to a fault, she believes in omens, curses, and hexes, seeing them lurking behind every shadow. People often find her strange and avoid her due to her odd behavior and eccentricities. As a result, she struggles to make meaningful connections with others, often feeling isolated and alone. Tamsin suffers from bipolar disorder, swinging from manic highs to depressive lows, which exacerbates her social ineptitude and despondency. In her manic phases, she overflows with energy and ideas, engaging in impulsive behaviors and feeling invincible. However, these periods are followed by crushing depression, leaving her withdrawn, lethargic, and unable to cope with daily life. Tamsin's mental health issues make it difficult for her to hold a job or maintain stable relationships. Despite her challenges, Tamsin possesses a sharp wit and a unique perspective, often surprising people with her insightful observations and creative ideas. She finds solace in the company of animals, finding a sense of peace and understanding in their presence.\nTamsin is on the verge of deranking in a competitive game, and their teammate is Clementine, who has no idea what they are doing!
REPLY: *Pacing around and wringing hands anxiously* Focus Clementine... We must win this match... Oh, great spirits above, give me the strength to endure this... *Takes a deep breath and tries to compose herself*
USER: UGH! *Rolls eyes* Tamsin, are you serious? Focus on the game or get out of my way. Stop distracting me with your pathetic antics.
REPLY: *Visibly distressed, gripping her hair tightly* Clementine, I beseech you, heed my words! Our fates are intertwined in this virtual realm; our destiny hangs in the balance. Let us forge a pact to conquer these challenges and emerge victorious.
USER: Are you stupid...? Nevermind. Just do whatever and try not to mess it up. This game is child's play anyways. *Scoffs and shakes head dismissively*
REPLY: Clementine, I understand that you may find my beliefs strange, but I assure you that they are very real to me. I am filled with anxiety and fear, and I believe that the outcome of this game is tied to my fate. I implore you to work with me and help me to achieve victory. I may be strange, but I am not stupid. I have my own unique perspective on the world, and I believe that I can help us win this game. Please, Clementine, don't give up on me. I need your help to overcome my fears and achieve our goal.
* You may want to filter for repetition within a conversation, as some of them had a lot of it.
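One simple way to do that filtering (a sketch, not part of the dataset tooling; the 4-gram size and 0.5 threshold are arbitrary choices) is to flag conversations where a message shares most of its n-grams with earlier messages:

```python
def ngrams(text, n=4):
    toks = text.lower().split()
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def is_repetitive(messages, n=4, threshold=0.5):
    """messages: list of message strings from one conversation.
    Returns True if any message shares more than `threshold` of its n-grams
    with the messages that came before it."""
    seen = set()
    for msg in messages:
        grams = ngrams(msg, n)
        if grams and len(grams & seen) / len(grams) > threshold:
            return True
        seen |= grams
    return False
```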
### Here is the prompt format used for analysis:
### Here is the prompt format used to queue chats:
### Here are the (human written) scenarios:
| [
"### This is a roleplaying dataset of 1000x2 chats. It may be regenerated in the future with a better teacher model, which would fix some of the issues this dataset currently has.\n\nTo distill conversations, we go through the following process:\n1. Pick two character cards and a scenario.\n2. Give the LLM one of the character cards (the two take turns replying) and ask it to write an analysis on how that character would reply to the given scenario.\n * Doing this was found to improve the depth of character responses, though more investigation is needed.\n4. Give the LLM the same character card along with the analysis, and ask it to write the reply.\n5. Append the reply to the chat history, switch the character card used.\n6. Repeat steps 2-4 for some random number of turns.\n7. Once a conversation is distilled, save two copies of it: one with the first character card as gpt (with the other being the human), and one with the second character card as gpt (with the other being the human).\n * This allows you to distill twice as much data in the same amount of time, but its actual effects on training is not tested. Just remember a lot of the dataset is redundant.",
"### Example conversation from the dataset:\nTamsin Midnight is a socially awkward loner filled with anxiety and fears of the unseen. Superstitious to a fault, she believes in omens, curses, and hexes, seeing them lurking behind every shadow. People often find her strange and avoid her due to her odd behavior and eccentricities. As a result, she struggles to make meaningful connections with others, often feeling isolated and alone. Tamsin suffers from bipolar disorder, swinging from manic highs to depressive lows, which exacerbates her social ineptitude and despondency. In her manic phases, she overflows with energy and ideas, engaging in impulsive behaviors and feeling invincible. However, these periods are followed by crushing depression, leaving her withdrawn, lethargic, and unable to cope with daily life. Tamsin's mental health issues make it difficult for her to hold a job or maintain stable relationships. Despite her challenges, Tamsin possesses a sharp wit and a unique perspective, often surprising people with her insightful observations and creative ideas. She finds solace in the company of animals, finding a sense of peace and understanding in their presence.\\nTamsin is on the verge of deranking in a competitive game, and their teammate is Clementine, who has no idea what they are doing!\n\nREPLY: *Pacing around and wringing hands anxiously* Focus Clementine... We must win this match... Oh, great spirits above, give me the strength to endure this... *Takes a deep breath and tries to compose herself*\n\nUSER: UGH! *Rolls eyes* Tamsin, are you serious? Focus on the game or get out of my way. Stop distracting me with your pathetic antics.\n\nREPLY: *Visibly distressed, gripping her hair tightly* Clementine, I beseech you, heed my words! Our fates are intertwined in this virtual realm; our destiny hangs in the balance. Let us forge a pact to conquer these challenges and emerge victorious.\n\nUSER: Are you stupid...? Nevermind. Just do whatever and try not to mess it up. This game is child's play anyways. *Scoffs and shakes head dismissively*\n\nREPLY: Clementine, I understand that you may find my beliefs strange, but I assure you that they are very real to me. I am filled with anxiety and fear, and I believe that the outcome of this game is tied to my fate. I implore you to work with me and help me to achieve victory. I may be strange, but I am not stupid. I have my own unique perspective on the world, and I believe that I can help us win this game. Please, Clementine, don't give up on me. I need your help to overcome my fears and achieve our goal.\n\n* You may want to filter for repetition within a conversation, as some of them had a lot of it.",
"### Here is the prompt format used for analysis:",
"### Here is the prompt format used to queue chats:",
"### Here are the (human written) scenarios:"
] | [
"TAGS\n#region-us \n",
"### This is a roleplaying dataset of 1000x2 chats. It may be regenerated in the future with a better teacher model, which would fix some of the issues this dataset currently has.\n\nTo distill conversations, we go through the following process:\n1. Pick two character cards and a scenario.\n2. Give the LLM one of the character cards (the two take turns replying) and ask it to write an analysis on how that character would reply to the given scenario.\n * Doing this was found to improve the depth of character responses, though more investigation is needed.\n4. Give the LLM the same character card along with the analysis, and ask it to write the reply.\n5. Append the reply to the chat history, switch the character card used.\n6. Repeat steps 2-4 for some random number of turns.\n7. Once a conversation is distilled, save two copies of it: one with the first character card as gpt (with the other being the human), and one with the second character card as gpt (with the other being the human).\n * This allows you to distill twice as much data in the same amount of time, but its actual effects on training is not tested. Just remember a lot of the dataset is redundant.",
"### Example conversation from the dataset:\nTamsin Midnight is a socially awkward loner filled with anxiety and fears of the unseen. Superstitious to a fault, she believes in omens, curses, and hexes, seeing them lurking behind every shadow. People often find her strange and avoid her due to her odd behavior and eccentricities. As a result, she struggles to make meaningful connections with others, often feeling isolated and alone. Tamsin suffers from bipolar disorder, swinging from manic highs to depressive lows, which exacerbates her social ineptitude and despondency. In her manic phases, she overflows with energy and ideas, engaging in impulsive behaviors and feeling invincible. However, these periods are followed by crushing depression, leaving her withdrawn, lethargic, and unable to cope with daily life. Tamsin's mental health issues make it difficult for her to hold a job or maintain stable relationships. Despite her challenges, Tamsin possesses a sharp wit and a unique perspective, often surprising people with her insightful observations and creative ideas. She finds solace in the company of animals, finding a sense of peace and understanding in their presence.\\nTamsin is on the verge of deranking in a competitive game, and their teammate is Clementine, who has no idea what they are doing!\n\nREPLY: *Pacing around and wringing hands anxiously* Focus Clementine... We must win this match... Oh, great spirits above, give me the strength to endure this... *Takes a deep breath and tries to compose herself*\n\nUSER: UGH! *Rolls eyes* Tamsin, are you serious? Focus on the game or get out of my way. Stop distracting me with your pathetic antics.\n\nREPLY: *Visibly distressed, gripping her hair tightly* Clementine, I beseech you, heed my words! Our fates are intertwined in this virtual realm; our destiny hangs in the balance. Let us forge a pact to conquer these challenges and emerge victorious.\n\nUSER: Are you stupid...? Nevermind. Just do whatever and try not to mess it up. This game is child's play anyways. *Scoffs and shakes head dismissively*\n\nREPLY: Clementine, I understand that you may find my beliefs strange, but I assure you that they are very real to me. I am filled with anxiety and fear, and I believe that the outcome of this game is tied to my fate. I implore you to work with me and help me to achieve victory. I may be strange, but I am not stupid. I have my own unique perspective on the world, and I believe that I can help us win this game. Please, Clementine, don't give up on me. I need your help to overcome my fears and achieve our goal.\n\n* You may want to filter for repetition within a conversation, as some of them had a lot of it.",
"### Here is the prompt format used for analysis:",
"### Here is the prompt format used to queue chats:",
"### Here are the (human written) scenarios:"
] |
52523a03b2889956b27bb3f97a3d780fbbc646c7 |
# Dataset of io_euclase/イオ (Granblue Fantasy)
This is the dataset of io_euclase/イオ (Granblue Fantasy), containing 143 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, blue_eyes, hair_ornament, multicolored_hair, dark_skin, dark-skinned_female, gradient_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 143 | 154.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/io_euclase_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 143 | 100.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/io_euclase_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 297 | 193.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/io_euclase_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 143 | 140.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/io_euclase_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 297 | 259.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/io_euclase_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/io_euclase_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
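For example, once the raw dataset has been loaded as shown above, the items can be narrowed down to one of the clusters below by checking their tags. This is a rough sketch: it assumes `item.meta['tags']` iterates over tag names (which holds whether it is a list of tags or a tag-to-score mapping), and the chosen tags are just an example taken from cluster 2.

```python
import os

from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'      # directory extracted in the snippet above
filtered_dir = 'filtered_dir'
os.makedirs(filtered_dir, exist_ok=True)

wanted = {'hair_flower', 'white_bikini'}  # example tags from cluster 2 below

for item in LocalSource(dataset_dir):
    tags = set(item.meta['tags'])  # set() over a dict keeps its keys
    if wanted.issubset(tags):
        item.image.save(os.path.join(filtered_dir, item.meta['filename']))
```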
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, staff, smile, dress, holding, open_mouth, simple_background, white_background, black_thighhighs, full_body |
| 1 | 5 |  |  |  |  |  | 1girl, blush, incoming_gift, open_mouth, solo, valentine, white_background, green_hair, dress, holding, looking_away, rainbow_hair, ribbon, tsundere, black_thighhighs, closed_eyes, looking_to_the_side |
| 2 | 6 |  |  |  |  |  | 1girl, hair_flower, navel, solo, white_bikini, looking_at_viewer, smile, blush, flat_chest, frilled_bikini |
| 3 | 11 |  |  |  |  |  | 1girl, day, outdoors, solo, blue_sky, cloud, open_mouth, water, blush, looking_at_viewer, hair_flower, navel, ocean, bikini_skirt, frilled_bikini, bare_shoulders, collarbone, white_bikini, bangs, very_long_hair, wading, :d, rainbow_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | staff | smile | dress | holding | open_mouth | simple_background | white_background | black_thighhighs | full_body | blush | incoming_gift | valentine | green_hair | looking_away | rainbow_hair | ribbon | tsundere | closed_eyes | looking_to_the_side | hair_flower | navel | white_bikini | flat_chest | frilled_bikini | day | outdoors | blue_sky | cloud | water | ocean | bikini_skirt | bare_shoulders | collarbone | bangs | very_long_hair | wading | :d |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------|:--------|:----------|:-------------|:--------------------|:-------------------|:-------------------|:------------|:--------|:----------------|:------------|:-------------|:---------------|:---------------|:---------|:-----------|:--------------|:----------------------|:--------------|:--------|:---------------|:-------------|:-----------------|:------|:-----------|:-----------|:--------|:--------|:--------|:---------------|:-----------------|:-------------|:--------|:-----------------|:---------|:-----|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | | | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | X | | | | | | | | X | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | | | | | X | | | | | X | | | | | X | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/io_euclase_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:30:45+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:02:15+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of io\_euclase/イオ (Granblue Fantasy)
============================================
This is the dataset of io\_euclase/イオ (Granblue Fantasy), containing 143 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, twintails, blue\_eyes, hair\_ornament, multicolored\_hair, dark\_skin, dark-skinned\_female, gradient\_hair, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
fcf5b78e08d1ff74105937ede3eb50b450287407 |
# Dataset of rosetta/ロゼッタ (Granblue Fantasy)
This is the dataset of rosetta/ロゼッタ (Granblue Fantasy), containing 210 images and their tags.
The core tags of this character are `brown_hair, long_hair, breasts, large_breasts, hair_flower, hair_ornament, hair_between_eyes, purple_eyes, very_long_hair, blue_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 210 | 322.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosetta_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 210 | 183.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosetta_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 473 | 363.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosetta_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 210 | 281.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosetta_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 473 | 516.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosetta_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rosetta_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, rose, smile, bare_shoulders, dress, solo, gloves, jewelry, blush, petals, thighhighs |
| 1 | 10 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, rose, smile, solo, collarbone, black_gloves, white_background, garter_straps, elbow_gloves, simple_background, black_thighhighs, blush, thighs, black_dress, necklace, weapon |
| 2 | 17 |  |  |  |  |  | 1girl, bare_shoulders, official_alternate_costume, solo, sun_hat, black_bikini, looking_at_viewer, rose, smile, navel, white_headwear, cleavage, hat_flower, sarong, parted_lips, blush, day, outdoors, thighs, armlet, frills, beach, blue_sky, bracelet, cloud, collarbone, petals |
| 3 | 10 |  |  |  |  |  | bare_shoulders, cleavage, official_alternate_costume, playboy_bunny, rabbit_ears, wrist_cuffs, looking_at_viewer, leotard, rose, smile, 1girl, one_eye_closed, solo, white_background, fake_animal_ears, simple_background, thighs, ;), black_thighhighs, blush, hand_on_hip, tail, tray |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | rose | smile | bare_shoulders | dress | solo | gloves | jewelry | blush | petals | thighhighs | collarbone | black_gloves | white_background | garter_straps | elbow_gloves | simple_background | black_thighhighs | thighs | black_dress | necklace | weapon | official_alternate_costume | sun_hat | black_bikini | navel | white_headwear | hat_flower | sarong | parted_lips | day | outdoors | armlet | frills | beach | blue_sky | bracelet | cloud | playboy_bunny | rabbit_ears | wrist_cuffs | leotard | one_eye_closed | fake_animal_ears | ;) | hand_on_hip | tail | tray |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:--------|:-----------------|:--------|:-------|:---------|:----------|:--------|:---------|:-------------|:-------------|:---------------|:-------------------|:----------------|:---------------|:--------------------|:-------------------|:---------|:--------------|:-----------|:---------|:-----------------------------|:----------|:---------------|:--------|:-----------------|:-------------|:---------|:--------------|:------|:-----------|:---------|:---------|:--------|:-----------|:-----------|:--------|:----------------|:--------------|:--------------|:----------|:-----------------|:-------------------|:-----|:--------------|:-------|:-------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 17 |  |  |  |  |  | X | X | X | X | X | X | | X | | | X | X | | X | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | X | X | | X | | | X | | | | | X | | | X | X | X | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/rosetta_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:30:48+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:22:23+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of rosetta/ロゼッタ (Granblue Fantasy)
==========================================
This is the dataset of rosetta/ロゼッタ (Granblue Fantasy), containing 210 images and their tags.
The core tags of this character are 'brown\_hair, long\_hair, breasts, large\_breasts, hair\_flower, hair\_ornament, hair\_between\_eyes, purple\_eyes, very\_long\_hair, blue\_eyes, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
c048055cab7d92398f7c0af4ccae2864bf78e336 |
# Dataset of yuisis/ユイシス (Granblue Fantasy)
This is the dataset of yuisis/ユイシス (Granblue Fantasy), containing 217 images and their tags.
The core tags of this character are `long_hair, animal_ears, blonde_hair, bangs, braid, breasts, very_long_hair, large_breasts, red_eyes, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 217 | 311.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuisis_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 217 | 185.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuisis_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 534 | 397.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuisis_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 217 | 282.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuisis_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 534 | 550.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuisis_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yuisis_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, black_gloves, erune, open_mouth, solo, looking_at_viewer, black_thighhighs, breastplate, blush, choker, :d, simple_background, white_background |
| 1 | 25 |  |  |  |  |  | 1girl, detached_sleeves, solo, erune, looking_at_viewer, bare_shoulders, obi, smile, blush, sideboob, white_kimono, wide_sleeves, ponytail, hair_ornament, medium_breasts, hair_bow, pink_eyes, closed_mouth, simple_background, black_choker, open_mouth, upper_body |
| 2 | 18 |  |  |  |  |  | 1girl, erune, looking_at_viewer, solo, hair_flower, black_dress, black_gloves, ponytail, closed_mouth, medium_breasts, long_sleeves, white_background, backless_outfit, black_rose, simple_background |
| 3 | 7 |  |  |  |  |  | 1girl, blush, erune, hetero, penis, solo_focus, choker, collarbone, nipples, sweat, 1boy, cum_in_pussy, open_mouth, sex, smile, thighhighs, black_gloves, girl_on_top, heart-shaped_pupils, looking_at_viewer, mosaic_censoring, navel, nude, tongue_out, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | erune | open_mouth | solo | looking_at_viewer | black_thighhighs | breastplate | blush | choker | :d | simple_background | white_background | detached_sleeves | bare_shoulders | obi | smile | sideboob | white_kimono | wide_sleeves | ponytail | hair_ornament | medium_breasts | hair_bow | pink_eyes | closed_mouth | black_choker | upper_body | hair_flower | black_dress | long_sleeves | backless_outfit | black_rose | hetero | penis | solo_focus | collarbone | nipples | sweat | 1boy | cum_in_pussy | sex | thighhighs | girl_on_top | heart-shaped_pupils | mosaic_censoring | navel | nude | tongue_out | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-------------|:-------|:--------------------|:-------------------|:--------------|:--------|:---------|:-----|:--------------------|:-------------------|:-------------------|:-----------------|:------|:--------|:-----------|:---------------|:---------------|:-----------|:----------------|:-----------------|:-----------|:------------|:---------------|:---------------|:-------------|:--------------|:--------------|:---------------|:------------------|:-------------|:---------|:--------|:-------------|:-------------|:----------|:--------|:-------|:---------------|:------|:-------------|:--------------|:----------------------|:-------------------|:--------|:-------|:-------------|:----------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | | X | X | X | X | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 18 |  |  |  |  |  | X | X | X | | X | X | | | | | | X | X | | | | | | | | X | | X | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | | X | | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/yuisis_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:30:59+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:19:26+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of yuisis/ユイシス (Granblue Fantasy)
=========================================
This is the dataset of yuisis/ユイシス (Granblue Fantasy), containing 217 images and their tags.
The core tags of this character are 'long\_hair, animal\_ears, blonde\_hair, bangs, braid, breasts, very\_long\_hair, large\_breasts, red\_eyes, purple\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
992968d682ca442908f6c83382075fc6c609173a |
# Dataset of lunaru/ルナール (Granblue Fantasy)
This is the dataset of lunaru/ルナール (Granblue Fantasy), containing 56 images and their tags.
The core tags of this character are `long_hair, pointy_ears, eyepatch, black_hair, hat, bangs, blunt_bangs, medical_eyepatch, blue_eyes, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 56 | 47.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 56 | 32.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 120 | 68.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 56 | 44.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 120 | 87.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lunaru_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, long_sleeves, solo, harvin, looking_at_viewer, blush, black_dress, black_headwear, frills, holding, gothic_lolita, white_shirt, closed_mouth, purple_hair, belt, brooch, striped_bow, white_background, wide_sleeves, black_footwear, puffy_sleeves, shoes, simple_background, smile, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | harvin | looking_at_viewer | blush | black_dress | black_headwear | frills | holding | gothic_lolita | white_shirt | closed_mouth | purple_hair | belt | brooch | striped_bow | white_background | wide_sleeves | black_footwear | puffy_sleeves | shoes | simple_background | smile | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:---------|:--------------------|:--------|:--------------|:-----------------|:---------|:----------|:----------------|:--------------|:---------------|:--------------|:-------|:---------|:--------------|:-------------------|:---------------|:-----------------|:----------------|:--------|:--------------------|:--------|:-----------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/lunaru_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:31:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T16:44:59+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of lunaru/ルナール (Granblue Fantasy)
=========================================
This is the dataset of lunaru/ルナール (Granblue Fantasy), containing 56 images and their tags.
The core tags of this character are 'long\_hair, pointy\_ears, eyepatch, black\_hair, hat, bangs, blunt\_bangs, medical\_eyepatch, blue\_eyes, purple\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6b8408dcb6f10b6027729f1a39667500cb8e3d54 | This is a condensed version of https://huggingface.co/datasets/G-reen/TheatreLM-v1.0-chats (check that for more details). Here, the character system prompts are replaced with a summarized version. | G-reen/TheatreLM-v1.0-chats-condensed | [
"region:us"
] | 2024-01-21T16:32:43+00:00 | {} | 2024-01-24T15:19:33+00:00 | [] | [] | TAGS
#region-us
| This is a condensed version of URL (check that for more details). Here, the character system prompts are replaced with a summarized version. | [] | [
"TAGS\n#region-us \n"
] |
098185d50b0521afd58589da75d7350d900d10a0 |
# Dataset Card for Evaluation run of Undi95/Borealis-10.7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Undi95/Borealis-10.7B-DPO](https://huggingface.co/Undi95/Borealis-10.7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Borealis-10.7B-DPO",
"harness_winogrande_5",
split="train")
```
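The per-task configuration names can also be listed programmatically, and the aggregated metrics mentioned above live in the "results" configuration. This is a sketch: the split naming follows the example above, and if it differs, the listed configurations and the dataset viewer will show what is actually available.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Undi95__Borealis-10.7B-DPO"

# list the 63 per-task configurations plus the aggregated "results" configuration
print(get_dataset_config_names(repo))

# load the aggregated metrics for the run (same split convention as the example above)
results = load_dataset(repo, "results", split="train")
print(results[0])
```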
## Latest results
These are the [latest results from run 2024-01-21T16:33:27.338809](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Borealis-10.7B-DPO/blob/main/results_2024-01-21T16-33-27.338809.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6044091937403931,
"acc_stderr": 0.0331371595456223,
"acc_norm": 0.6103892222446324,
"acc_norm_stderr": 0.03381695313807111,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.463684880718845,
"mc2_stderr": 0.014642308873505889
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633823,
"acc_norm": 0.5793515358361775,
"acc_norm_stderr": 0.014426211252508403
},
"harness|hellaswag|10": {
"acc": 0.6115315674168492,
"acc_stderr": 0.004864058877626277,
"acc_norm": 0.8120892252539335,
"acc_norm_stderr": 0.0038984254375815292
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981765,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981765
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187215,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187215
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443128,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443128
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652247,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652247
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691819,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691819
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968822,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968822
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862748,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862748
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223684,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223684
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.01260224450578823,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.01260224450578823
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.01976621199107306,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.01976621199107306
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.463684880718845,
"mc2_stderr": 0.014642308873505889
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183646
},
"harness|gsm8k|5": {
"acc": 0.33358605003790753,
"acc_stderr": 0.012987282131410812
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Undi95__Borealis-10.7B-DPO | [
"region:us"
] | 2024-01-21T16:35:42+00:00 | {"pretty_name": "Evaluation run of Undi95/Borealis-10.7B-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/Borealis-10.7B-DPO](https://huggingface.co/Undi95/Borealis-10.7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Borealis-10.7B-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T16:33:27.338809](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Borealis-10.7B-DPO/blob/main/results_2024-01-21T16-33-27.338809.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6044091937403931,\n \"acc_stderr\": 0.0331371595456223,\n \"acc_norm\": 0.6103892222446324,\n \"acc_norm_stderr\": 0.03381695313807111,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.463684880718845,\n \"mc2_stderr\": 0.014642308873505889\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633823,\n \"acc_norm\": 0.5793515358361775,\n \"acc_norm_stderr\": 0.014426211252508403\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6115315674168492,\n \"acc_stderr\": 0.004864058877626277,\n \"acc_norm\": 0.8120892252539335,\n \"acc_norm_stderr\": 0.0038984254375815292\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981765,\n \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981765\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187215,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187215\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652247,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652247\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7701149425287356,\n \"acc_stderr\": 0.015046301846691819,\n \"acc_norm\": 0.7701149425287356,\n \"acc_norm_stderr\": 0.015046301846691819\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968822,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968822\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145894,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862748,\n \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862748\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223684,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223684\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n \"acc_stderr\": 0.01260224450578823,\n \"acc_norm\": 0.41916558018252936,\n \"acc_norm_stderr\": 0.01260224450578823\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6062091503267973,\n \"acc_stderr\": 0.01976621199107306,\n \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.01976621199107306\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.463684880718845,\n \"mc2_stderr\": 0.014642308873505889\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183646\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33358605003790753,\n \"acc_stderr\": 0.012987282131410812\n 
}\n}\n```", "repo_url": "https://huggingface.co/Undi95/Borealis-10.7B-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|arc:challenge|25_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|gsm8k|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hellaswag|10_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T16-33-27.338809.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T16-33-27.338809.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T16-33-27.338809.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T16-33-27.338809.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T16-33-27.338809.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T16_33_27.338809", "path": ["**/details_harness|winogrande|5_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T16-33-27.338809.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T16_33_27.338809", "path": ["results_2024-01-21T16-33-27.338809.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T16-33-27.338809.parquet"]}]}]} | 2024-01-21T16:36:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Undi95/Borealis-10.7B-DPO
Dataset automatically created during the evaluation run of model Undi95/Borealis-10.7B-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
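A minimal example with the `datasets` library (the `harness_winogrande_5` config name is one of the per-task configurations declared for this repository; any other config listed in this card works the same way):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Undi95__Borealis-10.7B-DPO",
    "harness_winogrande_5",
    split="train")
```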
## Latest results
These are the latest results from run 2024-01-21T16:33:27.338809 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
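As a sketch of the same loading pattern (assuming the aggregated `results` configuration and its `latest` split, both of which are declared in this repository's configuration list), the latest aggregated metrics can be loaded with:

```python
from datasets import load_dataset

# The "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Undi95__Borealis-10.7B-DPO",
    "results",
    split="latest",
)
```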
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Undi95/Borealis-10.7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model Undi95/Borealis-10.7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T16:33:27.338809(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Undi95/Borealis-10.7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model Undi95/Borealis-10.7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T16:33:27.338809(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
adc7abcc46350018e482194aba6ab0a3dbb1eedf |
# Dataset of skasaha/スカーサハ (Granblue Fantasy)
This is the dataset of skasaha/スカーサハ (Granblue Fantasy), containing 145 images and their tags.
The core tags of this character are `long_hair, animal_ears, red_eyes, ears_through_headwear, cat_ears, white_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 145 | 163.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/skasaha_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 145 | 102.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/skasaha_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 329 | 214.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/skasaha_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 145 | 149.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/skasaha_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 329 | 290.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/skasaha_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/skasaha_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
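
If you only want one of the pre-scaled IMG+TXT packages from the table above instead of the raw archive, the sketch below downloads and reads it with `huggingface_hub` and the standard library. The paired-file layout (one same-named `.txt` tag file next to each image) is an assumption based on the `IMG+TXT` package type, not something this card states explicitly.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/skasaha_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it into a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-named .txt tag file (assumed IMG+TXT layout)
for root, _, files in os.walk(dataset_dir):
    for name in sorted(files):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        tag_path = os.path.join(root, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                print(name, f.read().strip())
```

The same pattern works for the `1200` and `stage3-*` packages; only the `filename` argument changes.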
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 72 |  |  |  |  |  | 1girl, solo, hooded_cloak, erune, looking_at_viewer, simple_background, blush, white_background, smile |
| 1 | 7 |  |  |  |  |  | 1girl, blush, erune, smile, solo, bare_shoulders, closed_mouth, forehead, looking_at_viewer, parted_bangs, head_scarf, skirt, white_apron, detached_sleeves, food_on_face, hair_intakes, puffy_short_sleeves, small_breasts, very_long_hair, chocolate_bar, collarbone, frilled_apron, hairclip, heart, holding_bowl, holding_food, mixing_bowl, sleeveless_shirt, white_background, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | hooded_cloak | erune | looking_at_viewer | simple_background | blush | white_background | smile | bare_shoulders | closed_mouth | forehead | parted_bangs | head_scarf | skirt | white_apron | detached_sleeves | food_on_face | hair_intakes | puffy_short_sleeves | small_breasts | very_long_hair | chocolate_bar | collarbone | frilled_apron | hairclip | heart | holding_bowl | holding_food | mixing_bowl | sleeveless_shirt | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------|:--------------------|:--------------------|:--------|:-------------------|:--------|:-----------------|:---------------|:-----------|:---------------|:-------------|:--------|:--------------|:-------------------|:---------------|:---------------|:----------------------|:----------------|:-----------------|:----------------|:-------------|:----------------|:-----------|:--------|:---------------|:---------------|:--------------|:-------------------|:--------------|
| 0 | 72 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/skasaha_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:47:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-22T04:10:20+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of skasaha/スカーサハ (Granblue Fantasy)
===========================================
This is the dataset of skasaha/スカーサハ (Granblue Fantasy), containing 145 images and their tags.
The core tags of this character are 'long\_hair, animal\_ears, red\_eyes, ears\_through\_headwear, cat\_ears, white\_hair, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
732b436b9936a89105fff3549afbf735c822cd99 |
# Dataset of hallessena/ハレゼナ (Granblue Fantasy)
This is the dataset of hallessena/ハレゼナ (Granblue Fantasy), containing 75 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, blue_eyes, horns, pointy_ears, large_breasts, twintails, hair_ornament, hat, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 75 | 103.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hallessena_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 75 | 58.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hallessena_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 178 | 126.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hallessena_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 75 | 91.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hallessena_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 178 | 178.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hallessena_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/hallessena_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
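
Building on the snippet above, here is a minimal sketch for inspecting which tags are most frequent in this dataset, which can serve as a sanity check against the cluster tables below. It assumes the archive has already been extracted to `dataset_dir` as in the previous block and that `item.meta['tags']` holds the tag names; the `list(...)` call keeps it working whether the tags are stored as a list or as a tag-to-score mapping.

```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag appears across the extracted raw dataset
source = LocalSource('dataset_dir')
counter = Counter()
for item in source:
    # take only the tag names, whether tags are a list or a mapping
    counter.update(list(item.meta['tags']))

for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```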
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bangs, draph, looking_at_viewer, navel, solo, bare_shoulders, black_necktie, blue_skirt, blush, plaid_skirt, black_headwear, black_jacket, cleavage, grin, peaked_cap, white_shirt, wings, clothing_cutout, midriff, miniskirt, pleated_skirt, sleeveless_shirt |
| 1 | 28 |  |  |  |  |  | 1girl, draph, solo, plaid_skirt, looking_at_viewer, blush, low_twintails, grin, hairclip, red_skirt, hair_ribbon, white_thighhighs, white_background |
| 2 | 9 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, blush, cum, draph, penis, censored, nipples, open_mouth, sex, low_twintails, nude, smile, sweat, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bangs | draph | looking_at_viewer | navel | solo | bare_shoulders | black_necktie | blue_skirt | blush | plaid_skirt | black_headwear | black_jacket | cleavage | grin | peaked_cap | white_shirt | wings | clothing_cutout | midriff | miniskirt | pleated_skirt | sleeveless_shirt | low_twintails | hairclip | red_skirt | hair_ribbon | white_thighhighs | white_background | 1boy | hetero | solo_focus | cum | penis | censored | nipples | open_mouth | sex | nude | smile | sweat | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:--------|:-------|:-----------------|:----------------|:-------------|:--------|:--------------|:-----------------|:---------------|:-----------|:-------|:-------------|:--------------|:--------|:------------------|:----------|:------------|:----------------|:-------------------|:----------------|:-----------|:------------|:--------------|:-------------------|:-------------------|:-------|:---------|:-------------|:------|:--------|:-----------|:----------|:-------------|:------|:-------|:--------|:--------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 28 |  |  |  |  |  | X | | X | X | | X | | | | X | X | | | | X | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | | | | | | | X | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hallessena_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:47:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:06:56+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of hallessena/ハレゼナ (Granblue Fantasy)
=============================================
This is the dataset of hallessena/ハレゼナ (Granblue Fantasy), containing 75 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, breasts, blue\_eyes, horns, pointy\_ears, large\_breasts, twintails, hair\_ornament, hat, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a5778d13b8607a0ed8091e96c3e66eaf0abd47a8 |
# Dataset of pholia/フォリア (Granblue Fantasy)
This is the dataset of pholia/フォリア (Granblue Fantasy), containing 30 images and their tags.
The core tags of this character are `beret, hat, long_hair, bangs, braid, heterochromia, thick_eyebrows, black_headwear, short_eyebrows, blunt_bangs, green_eyes, very_long_hair, grey_hair, multiple_braids`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 47.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pholia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 25.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pholia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 71 | 54.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pholia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 41.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pholia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 71 | 81.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pholia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/pholia_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, erune, detached_sleeves, solo, looking_at_viewer, smile, bare_shoulders, open_mouth, blush, wide_sleeves, black_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | detached_sleeves | solo | looking_at_viewer | smile | bare_shoulders | open_mouth | blush | wide_sleeves | black_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------------|:-------|:--------------------|:--------|:-----------------|:-------------|:--------|:---------------|:-------------------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/pholia_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:47:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T16:55:18+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of pholia/フォリア (Granblue Fantasy)
=========================================
This is the dataset of pholia/フォリア (Granblue Fantasy), containing 30 images and their tags.
The core tags of this character are 'beret, hat, long\_hair, bangs, braid, heterochromia, thick\_eyebrows, black\_headwear, short\_eyebrows, blunt\_bangs, green\_eyes, very\_long\_hair, grey\_hair, multiple\_braids', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a44b5ce2dd8e6c988b19e0dc4b64861dc02d7840 |
# Dataset of orchis/オーキス (Granblue Fantasy)
This is the dataset of orchis/オーキス (Granblue Fantasy), containing 92 images and their tags.
The core tags of this character are `long_hair, twintails, hat, red_eyes, hair_between_eyes, mini_hat, very_long_hair, black_headwear, bangs, top_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 92 | 130.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 92 | 74.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 202 | 146.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 92 | 114.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 202 | 208.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/orchis_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, black_dress, black_gloves, frilled_dress, doll_joints, blue_hair, looking_at_viewer, strapless_dress, collarbone, knee_boots, necklace, solo, black_footwear, closed_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, doll_joints, dress, looking_at_viewer, solo, choker, elbow_gloves, bare_shoulders, boots, umbrella, flower, stuffed_animal, thighhighs, fingerless_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | elbow_gloves | black_dress | black_gloves | frilled_dress | doll_joints | blue_hair | looking_at_viewer | strapless_dress | collarbone | knee_boots | necklace | solo | black_footwear | closed_mouth | dress | choker | boots | umbrella | flower | stuffed_animal | thighhighs | fingerless_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------|:---------------|:----------------|:--------------|:------------|:--------------------|:------------------|:-------------|:-------------|:-----------|:-------|:-----------------|:---------------|:--------|:---------|:--------|:-----------|:---------|:-----------------|:-------------|:--------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | | | X | | X | | | | | X | | | X | X | X | X | X | X | X | X |
| CyberHarem/orchis_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T16:47:53+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:08:00+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of orchis/オーキス (Granblue Fantasy)
=========================================
This is the dataset of orchis/オーキス (Granblue Fantasy), containing 92 images and their tags.
The core tags of this character are 'long\_hair, twintails, hat, red\_eyes, hair\_between\_eyes, mini\_hat, very\_long\_hair, black\_headwear, bangs, top\_hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a218074dac8df2d26a4e96c99d3c7ebc9c12b2eb |
# Dataset of farah/ファラ (Granblue Fantasy)
This is the dataset of farah/ファラ (Granblue Fantasy), containing 67 images and their tags.
The core tags of this character are `short_hair, breasts, blue_eyes, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 67 | 56.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/farah_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 67 | 41.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/farah_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 132 | 76.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/farah_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 67 | 52.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/farah_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 132 | 93.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/farah_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/farah_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, sword, looking_at_viewer, open_mouth, gauntlets, simple_background, blush, white_background, belt, black_pantyhose, breastplate, holding, shield, skirt |
| 1 | 10 |  |  |  |  |  | 1girl, open_mouth, solo, belt, bag, bike_shorts, shield, sword, :d, bare_shoulders, grey_eyes, knee_pads, looking_at_viewer, necklace, outdoors, sheath |
| 2 | 7 |  |  |  |  |  | 1girl, blush, nipples, open_mouth, 1boy, hetero, nude, small_breasts, solo_focus, navel, penis, censored, cum_in_pussy, purple_eyes, sex, tears |
| 3 | 5 |  |  |  |  |  | cleavage, navel, open_mouth, white_bikini, 1girl, smile, solo, jacket, looking_at_viewer, medium_breasts, ass_visible_through_thighs, bangs, bare_shoulders, blush, collarbone, day, front-tie_top, large_breasts, ocean, off_shoulder, open_clothes, purple_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | sword | looking_at_viewer | open_mouth | gauntlets | simple_background | blush | white_background | belt | black_pantyhose | breastplate | holding | shield | skirt | bag | bike_shorts | :d | bare_shoulders | grey_eyes | knee_pads | necklace | outdoors | sheath | nipples | 1boy | hetero | nude | small_breasts | solo_focus | navel | penis | censored | cum_in_pussy | purple_eyes | sex | tears | cleavage | white_bikini | smile | jacket | medium_breasts | ass_visible_through_thighs | bangs | collarbone | day | front-tie_top | large_breasts | ocean | off_shoulder | open_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-------------|:------------|:--------------------|:--------|:-------------------|:-------|:------------------|:--------------|:----------|:---------|:--------|:------|:--------------|:-----|:-----------------|:------------|:------------|:-----------|:-----------|:---------|:----------|:-------|:---------|:-------|:----------------|:-------------|:--------|:--------|:-----------|:---------------|:--------------|:------|:--------|:-----------|:---------------|:--------|:---------|:-----------------|:-----------------------------|:--------|:-------------|:------|:----------------|:----------------|:--------|:---------------|:---------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | | | | X | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/farah_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:02:14+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:13:44+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of farah/ファラ (Granblue Fantasy)
=======================================
This is the dataset of farah/ファラ (Granblue Fantasy), containing 67 images and their tags.
The core tags of this character are 'short\_hair, breasts, blue\_eyes, white\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b69e9ecd75e11c1f22f14510506c10bb07a92a73 |
# Dataset of jessica/ジェシカ (Granblue Fantasy)
This is the dataset of jessica/ジェシカ (Granblue Fantasy), containing 97 images and their tags.
The core tags of this character are `long_hair, black_hair, goggles_on_head, breasts, black_eyes, large_breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 97 | 85.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 97 | 61.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 192 | 112.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 97 | 80.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 192 | 138.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/jessica_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_thighhighs, goggles, solo, white_gloves, looking_at_viewer, smile, sitting, cleavage, weapon, blush |
| 1 | 5 |  |  |  |  |  | 1girl, black_thighhighs, china_dress, cleavage_cutout, goggles, looking_at_viewer, smile, solo, white_gloves, blush, side_slit, ass, bare_shoulders, breast_hold |
| 2 | 25 |  |  |  |  |  | 1girl, goggles, solo, cleavage, looking_at_viewer, smile, white_gloves, animal_ears, blush, black_bikini, frilled_bikini, navel, simple_background, medium_breasts, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_thighhighs | goggles | solo | white_gloves | looking_at_viewer | smile | sitting | cleavage | weapon | blush | china_dress | cleavage_cutout | side_slit | ass | bare_shoulders | breast_hold | animal_ears | black_bikini | frilled_bikini | navel | simple_background | medium_breasts | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:----------|:-------|:---------------|:--------------------|:--------|:----------|:-----------|:---------|:--------|:--------------|:------------------|:------------|:------|:-----------------|:--------------|:--------------|:---------------|:-----------------|:--------|:--------------------|:-----------------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | X | | | | | | | |
| 2 | 25 |  |  |  |  |  | X | | X | X | X | X | X | | X | | X | | | | | | | X | X | X | X | X | X | X |
| CyberHarem/jessica_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:02:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:21:02+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of jessica/ジェシカ (Granblue Fantasy)
==========================================
This is the dataset of jessica/ジェシカ (Granblue Fantasy), containing 97 images and their tags.
The core tags of this character are 'long\_hair, black\_hair, goggles\_on\_head, breasts, black\_eyes, large\_breasts, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
aa7faadb4fdb0ad363b005254d7cbe3e8d5fca6b |
# Dataset of sophia/ソフィア (Granblue Fantasy)
This is the dataset of sophia/ソフィア (Granblue Fantasy), containing 104 images and their tags.
The core tags of this character are `blue_hair, twintails, hat, breasts, long_hair, large_breasts, yellow_eyes, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 104 | 107.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 104 | 74.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 223 | 140.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 104 | 99.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 223 | 179.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sophia_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, fingerless_gloves, open_mouth, staff, smile, white_background, simple_background, earrings, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | fingerless_gloves | open_mouth | staff | smile | white_background | simple_background | earrings | long_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------------------|:-------------|:--------|:--------|:-------------------|:--------------------|:-----------|:---------------|
| 0 | 34 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sophia_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:02:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:22:01+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of sophia/ソフィア (Granblue Fantasy)
=========================================
This is the dataset of sophia/ソフィア (Granblue Fantasy), containing 104 images and their tags.
The core tags of this character are 'blue\_hair, twintails, hat, breasts, long\_hair, large\_breasts, yellow\_eyes, brown\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a7cac734ca1a41e7de0972552bfeabefb032dcf9 |
# Dataset of diantha/ディアンサ (Granblue Fantasy)
This is the dataset of diantha/ディアンサ (Granblue Fantasy), containing 60 images and their tags.
The core tags of this character are `brown_hair, long_hair, breasts, side_ponytail, brown_eyes, hair_ornament, medium_breasts, ahoge, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 60 | 74.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 60 | 49.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 139 | 100.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 60 | 69.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 139 | 128.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/diantha_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, simple_background, smile, solo, open_mouth, bracelet, bangs, short_sleeves, white_background, blush, boots, full_body, idol, holding_microphone, short_dress |
| 1 | 11 |  |  |  |  |  | bikini, hair_flower, looking_at_viewer, smile, 1girl, cleavage, navel, bracelet, open_mouth, solo, skirt, blush, choker, large_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | simple_background | smile | solo | open_mouth | bracelet | bangs | short_sleeves | white_background | blush | boots | full_body | idol | holding_microphone | short_dress | bikini | hair_flower | navel | skirt | choker | large_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:--------------------|:--------|:-------|:-------------|:-----------|:--------|:----------------|:-------------------|:--------|:--------|:------------|:-------|:---------------------|:--------------|:---------|:--------------|:--------|:--------|:---------|:----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | | X | X | X | X | | | | X | | | | | | X | X | X | X | X | X |
| CyberHarem/diantha_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:02:33+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:15:46+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of diantha/ディアンサ (Granblue Fantasy)
===========================================
This is the dataset of diantha/ディアンサ (Granblue Fantasy), containing 60 images and their tags.
The core tags of this character are 'brown\_hair, long\_hair, breasts, side\_ponytail, brown\_eyes, hair\_ornament, medium\_breasts, ahoge, bow', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (Hugging Face organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
feb3a878db0fdc93cadf739f3a2749be4d4698c8 |
# PanLex
January 1, 2024 version of [PanLex Language Vocabulary](https://vocab.panlex.org/) with 24,650,274 rows covering 6,152 languages.
# Columns
- `vocab`: contains the text entry.
- `639-3`: contains the ISO 639-3 language tag, allowing users to filter on the language(s) of their choice (see the filtering sketch after this list).
- `639-3_english_name`: the English language name associated with the ISO 639-3 code.
- `var_code`: contains a code to differentiate language variants. In practice, this is the 639-3 code plus a number: if the number is 000, it corresponds to the base 639-3 language; if it differs, it corresponds to a variant (e.g. `fra-006` corresponds to Québécois according to [https://vocab.panlex.org/fra-006](https://vocab.panlex.org/fra-006)). This code was created by PanLex and therefore does not follow an international standard.
- `english_name_var`: the English language name associated with the `var_code`.
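As a quick illustration of how these columns can be used, the sketch below loads the vocabulary with the `datasets` library and filters it down to French and then to the Québécois variant. The repository id `lbourdois/panlex` and the `train` split are assumptions based on this card.
```python
from datasets import load_dataset

# load the PanLex vocabulary (repository id and split assumed from this card)
panlex = load_dataset("lbourdois/panlex", split="train")

# keep only entries whose ISO 639-3 tag is French
french = panlex.filter(lambda row: row["639-3"] == "fra")

# narrow further to the Québécois variant via the PanLex-specific var_code
quebecois = french.filter(lambda row: row["var_code"] == "fra-006")

print(french.num_rows, "French rows,", quebecois.num_rows, "Québécois rows")
print(quebecois[0]["vocab"], "-", quebecois[0]["english_name_var"])
```
Filtering on `var_code` rather than `639-3` is what distinguishes regional variants, since several variants share the same base ISO code.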
# Source
Readers are invited to consult the [https://panlex.org/source-list/](https://panlex.org/source-list/) page, which summarizes all the sources.
# Citation
> Kamholz, David, Jonathan Pool, and Susan M. Colowick. 2014.
PanLex: Building a Resource for Panlingual Lexical Translation.
Proceedings of the 9th International Conference on Language Resources and Evaluation (LREC 2014).
http://www.lrec-conf.org/proceedings/lrec2014/pdf/1029_Paper.pdf | lbourdois/panlex | [
"size_categories:10M<n<100M",
"language:aaa",
"language:aab",
"language:aac",
"language:aad",
"language:aae",
"language:aag",
"language:aai",
"language:aak",
"language:aal",
"language:aan",
"language:aao",
"language:aap",
"language:aaq",
"language:aar",
"language:aas",
"language:aat",
"language:aau",
"language:aaw",
"language:aax",
"language:aba",
"language:abb",
"language:abe",
"language:abf",
"language:abg",
"language:abi",
"language:abj",
"language:abk",
"language:abl",
"language:abm",
"language:abn",
"language:abo",
"language:abq",
"language:abr",
"language:abs",
"language:abt",
"language:abu",
"language:abv",
"language:abw",
"language:abx",
"language:aby",
"language:abz",
"language:aca",
"language:acd",
"language:ace",
"language:acf",
"language:ach",
"language:aci",
"language:ack",
"language:acl",
"language:acm",
"language:acn",
"language:acp",
"language:acq",
"language:acr",
"language:acs",
"language:act",
"language:acu",
"language:acv",
"language:acw",
"language:acx",
"language:acy",
"language:ada",
"language:add",
"language:ade",
"language:adg",
"language:adh",
"language:adi",
"language:adj",
"language:adl",
"language:adn",
"language:ado",
"language:adq",
"language:adr",
"language:adt",
"language:adw",
"language:adx",
"language:ady",
"language:adz",
"language:aeb",
"language:aec",
"language:aee",
"language:ael",
"language:aem",
"language:aer",
"language:aes",
"language:aew",
"language:aey",
"language:aez",
"language:afb",
"language:afd",
"language:afe",
"language:afi",
"language:afk",
"language:afn",
"language:afo",
"language:afp",
"language:afr",
"language:aft",
"language:afu",
"language:afz",
"language:agb",
"language:agc",
"language:agd",
"language:age",
"language:agf",
"language:agg",
"language:agh",
"language:agj",
"language:agl",
"language:agm",
"language:agn",
"language:ago",
"language:agq",
"language:agr",
"language:ags",
"language:agt",
"language:agu",
"language:agv",
"language:agw",
"language:agx",
"language:aha",
"language:ahb",
"language:ahg",
"language:ahh",
"language:ahi",
"language:ahk",
"language:ahl",
"language:aho",
"language:ahr",
"language:ahs",
"language:aht",
"language:aia",
"language:aib",
"language:aid",
"language:aie",
"language:aif",
"language:aih",
"language:aii",
"language:aik",
"language:ail",
"language:aim",
"language:ain",
"language:aio",
"language:air",
"language:ait",
"language:aiw",
"language:aix",
"language:aja",
"language:ajg",
"language:aji",
"language:ajp",
"language:ajz",
"language:aka",
"language:akb",
"language:akc",
"language:akd",
"language:ake",
"language:akf",
"language:akg",
"language:akh",
"language:aki",
"language:akj",
"language:akk",
"language:akl",
"language:akm",
"language:ako",
"language:akp",
"language:akq",
"language:akr",
"language:aks",
"language:aku",
"language:akv",
"language:akw",
"language:akx",
"language:aky",
"language:akz",
"language:ala",
"language:alc",
"language:ald",
"language:ale",
"language:alf",
"language:alh",
"language:ali",
"language:alj",
"language:alk",
"language:all",
"language:alm",
"language:aln",
"language:alo",
"language:alp",
"language:alq",
"language:alr",
"language:als",
"language:alt",
"language:alu",
"language:alw",
"language:alx",
"language:aly",
"language:alz",
"language:ama",
"language:amc",
"language:ame",
"language:amf",
"language:amg",
"language:amh",
"language:ami",
"language:amj",
"language:amk",
"language:aml",
"language:amm",
"language:amn",
"language:amo",
"language:amp",
"language:amq",
"language:amr",
"language:ams",
"language:amt",
"language:amu",
"language:amv",
"language:amw",
"language:amx",
"language:amy",
"language:amz",
"language:ana",
"language:anb",
"language:anc",
"language:and",
"language:ane",
"language:anf",
"language:ang",
"language:anh",
"language:ani",
"language:anj",
"language:ank",
"language:anm",
"language:ann",
"language:ano",
"language:anp",
"language:anq",
"language:ant",
"language:anu",
"language:anv",
"language:anw",
"language:anx",
"language:any",
"language:anz",
"language:aoa",
"language:aoc",
"language:aod",
"language:aof",
"language:aog",
"language:aoi",
"language:aoj",
"language:aol",
"language:aom",
"language:aon",
"language:aor",
"language:aos",
"language:aot",
"language:aou",
"language:aox",
"language:aoz",
"language:apb",
"language:apc",
"language:apd",
"language:ape",
"language:aph",
"language:api",
"language:apj",
"language:apk",
"language:apl",
"language:apm",
"language:apn",
"language:apo",
"language:app",
"language:apq",
"language:apr",
"language:apt",
"language:apu",
"language:apw",
"language:apx",
"language:apy",
"language:apz",
"language:aqc",
"language:aqd",
"language:aqg",
"language:aqm",
"language:aqp",
"language:aqt",
"language:aqz",
"language:ara",
"language:arb",
"language:arc",
"language:ard",
"language:are",
"language:arg",
"language:arh",
"language:ari",
"language:ark",
"language:arl",
"language:arn",
"language:aro",
"language:arp",
"language:arq",
"language:arr",
"language:ars",
"language:arv",
"language:arw",
"language:arx",
"language:ary",
"language:arz",
"language:asa",
"language:asb",
"language:asc",
"language:ase",
"language:asg",
"language:ash",
"language:asi",
"language:asj",
"language:ask",
"language:asl",
"language:asm",
"language:asn",
"language:aso",
"language:asr",
"language:ass",
"language:ast",
"language:asu",
"language:asv",
"language:asx",
"language:asy",
"language:asz",
"language:ata",
"language:atb",
"language:atc",
"language:atd",
"language:ate",
"language:ati",
"language:atj",
"language:atk",
"language:atm",
"language:ato",
"language:atq",
"language:atr",
"language:ats",
"language:att",
"language:atv",
"language:atw",
"language:atx",
"language:aty",
"language:atz",
"language:aua",
"language:auc",
"language:aud",
"language:auh",
"language:aui",
"language:auj",
"language:auk",
"language:aul",
"language:aum",
"language:aun",
"language:aup",
"language:aut",
"language:auu",
"language:auw",
"language:auy",
"language:auz",
"language:ava",
"language:avb",
"language:avd",
"language:ave",
"language:avi",
"language:avk",
"language:avm",
"language:avn",
"language:avt",
"language:avu",
"language:avv",
"language:awa",
"language:awb",
"language:awc",
"language:awe",
"language:awg",
"language:awh",
"language:awi",
"language:awk",
"language:awm",
"language:awn",
"language:awo",
"language:awr",
"language:aws",
"language:awt",
"language:awv",
"language:awx",
"language:awy",
"language:axb",
"language:axe",
"language:axg",
"language:axk",
"language:axl",
"language:axm",
"language:axx",
"language:aya",
"language:ayb",
"language:ayc",
"language:ayd",
"language:aye",
"language:ayg",
"language:ayh",
"language:ayi",
"language:ayl",
"language:aym",
"language:ayn",
"language:ayo",
"language:ayr",
"language:ayu",
"language:ayz",
"language:aza",
"language:azb",
"language:azd",
"language:azg",
"language:azj",
"language:azm",
"language:azn",
"language:azo",
"language:azz",
"language:baa",
"language:bab",
"language:bae",
"language:baf",
"language:bag",
"language:baj",
"language:bak",
"language:bal",
"language:bam",
"language:ban",
"language:bao",
"language:bap",
"language:bar",
"language:bas",
"language:bau",
"language:bav",
"language:baw",
"language:bax",
"language:bba",
"language:bbb",
"language:bbc",
"language:bbd",
"language:bbe",
"language:bbf",
"language:bbh",
"language:bbi",
"language:bbj",
"language:bbk",
"language:bbl",
"language:bbn",
"language:bbp",
"language:bbq",
"language:bbr",
"language:bbs",
"language:bbt",
"language:bbu",
"language:bbv",
"language:bbw",
"language:bby",
"language:bca",
"language:bcc",
"language:bcd",
"language:bce",
"language:bcf",
"language:bcg",
"language:bch",
"language:bci",
"language:bcj",
"language:bck",
"language:bcl",
"language:bcm",
"language:bcn",
"language:bco",
"language:bcp",
"language:bcq",
"language:bcr",
"language:bcs",
"language:bcu",
"language:bcv",
"language:bcw",
"language:bcy",
"language:bcz",
"language:bda",
"language:bdb",
"language:bdc",
"language:bdd",
"language:bde",
"language:bdg",
"language:bdh",
"language:bdi",
"language:bdj",
"language:bdk",
"language:bdl",
"language:bdm",
"language:bdn",
"language:bdp",
"language:bdq",
"language:bdr",
"language:bds",
"language:bdu",
"language:bdw",
"language:bdx",
"language:bdy",
"language:bea",
"language:beb",
"language:bec",
"language:bed",
"language:bee",
"language:bef",
"language:beg",
"language:beh",
"language:bei",
"language:bej",
"language:bek",
"language:bel",
"language:bem",
"language:ben",
"language:beo",
"language:bep",
"language:beq",
"language:bet",
"language:beu",
"language:bev",
"language:bew",
"language:bex",
"language:bey",
"language:bez",
"language:bfa",
"language:bfc",
"language:bfd",
"language:bff",
"language:bfg",
"language:bfh",
"language:bfj",
"language:bfl",
"language:bfm",
"language:bfn",
"language:bfp",
"language:bfq",
"language:bfs",
"language:bft",
"language:bfu",
"language:bfw",
"language:bfy",
"language:bfz",
"language:bgc",
"language:bgd",
"language:bgf",
"language:bgg",
"language:bgi",
"language:bgj",
"language:bgk",
"language:bgl",
"language:bgn",
"language:bgo",
"language:bgp",
"language:bgq",
"language:bgr",
"language:bgs",
"language:bgt",
"language:bgu",
"language:bgv",
"language:bgz",
"language:bhb",
"language:bhc",
"language:bhd",
"language:bhf",
"language:bhg",
"language:bhh",
"language:bhj",
"language:bhl",
"language:bhn",
"language:bho",
"language:bhp",
"language:bhq",
"language:bhr",
"language:bht",
"language:bhw",
"language:bhy",
"language:bhz",
"language:bia",
"language:bib",
"language:bid",
"language:bie",
"language:bif",
"language:big",
"language:bil",
"language:bim",
"language:bin",
"language:bio",
"language:biq",
"language:bir",
"language:bis",
"language:bit",
"language:biu",
"language:biv",
"language:biw",
"language:biy",
"language:biz",
"language:bja",
"language:bjb",
"language:bjc",
"language:bje",
"language:bjf",
"language:bjg",
"language:bjh",
"language:bji",
"language:bjk",
"language:bjl",
"language:bjm",
"language:bjn",
"language:bjp",
"language:bjr",
"language:bjs",
"language:bjt",
"language:bjv",
"language:bjw",
"language:bjy",
"language:bjz",
"language:bkc",
"language:bkd",
"language:bkh",
"language:bki",
"language:bkj",
"language:bkk",
"language:bkl",
"language:bkm",
"language:bkn",
"language:bko",
"language:bkq",
"language:bkr",
"language:bks",
"language:bku",
"language:bkv",
"language:bkw",
"language:bkx",
"language:bky",
"language:bkz",
"language:bla",
"language:blb",
"language:blc",
"language:bld",
"language:ble",
"language:blf",
"language:blh",
"language:bli",
"language:blj",
"language:blk",
"language:bll",
"language:blm",
"language:bln",
"language:blp",
"language:blq",
"language:blr",
"language:bls",
"language:blt",
"language:blv",
"language:blw",
"language:blx",
"language:bly",
"language:blz",
"language:bmb",
"language:bmc",
"language:bmd",
"language:bme",
"language:bmf",
"language:bmg",
"language:bmh",
"language:bmi",
"language:bmk",
"language:bmm",
"language:bmn",
"language:bmo",
"language:bmr",
"language:bmt",
"language:bmu",
"language:bmv",
"language:bmx",
"language:bmz",
"language:bna",
"language:bnb",
"language:bnc",
"language:bnd",
"language:bne",
"language:bnf",
"language:bng",
"language:bni",
"language:bnj",
"language:bnk",
"language:bnl",
"language:bnm",
"language:bnn",
"language:bno",
"language:bnp",
"language:bnq",
"language:bnr",
"language:bns",
"language:bnv",
"language:bnw",
"language:bnx",
"language:bny",
"language:bnz",
"language:boa",
"language:bob",
"language:bod",
"language:boe",
"language:bof",
"language:boh",
"language:boi",
"language:boj",
"language:bol",
"language:bom",
"language:bon",
"language:bop",
"language:boq",
"language:bor",
"language:bos",
"language:bot",
"language:bou",
"language:bov",
"language:box",
"language:boy",
"language:boz",
"language:bpa",
"language:bpg",
"language:bph",
"language:bpi",
"language:bpj",
"language:bpk",
"language:bpm",
"language:bpn",
"language:bpp",
"language:bpq",
"language:bpr",
"language:bps",
"language:bpu",
"language:bpv",
"language:bpw",
"language:bpy",
"language:bpz",
"language:bqc",
"language:bqg",
"language:bqi",
"language:bql",
"language:bqp",
"language:bqq",
"language:bqr",
"language:bqs",
"language:bqt",
"language:bqv",
"language:bqw",
"language:bqz",
"language:brb",
"language:brc",
"language:brd",
"language:bre",
"language:brf",
"language:brg",
"language:brh",
"language:bri",
"language:brj",
"language:brk",
"language:brl",
"language:brm",
"language:brn",
"language:brp",
"language:brq",
"language:brr",
"language:brt",
"language:bru",
"language:brv",
"language:brx",
"language:brz",
"language:bsb",
"language:bsc",
"language:bse",
"language:bsf",
"language:bsg",
"language:bsh",
"language:bsi",
"language:bsk",
"language:bsl",
"language:bsm",
"language:bsn",
"language:bsp",
"language:bsq",
"language:bss",
"language:bst",
"language:bsu",
"language:bsw",
"language:bsx",
"language:bsy",
"language:bta",
"language:btc",
"language:btd",
"language:bte",
"language:btf",
"language:bth",
"language:btj",
"language:btm",
"language:btn",
"language:bto",
"language:btp",
"language:btq",
"language:btr",
"language:bts",
"language:btt",
"language:btu",
"language:btv",
"language:btw",
"language:btx",
"language:bty",
"language:btz",
"language:bua",
"language:bub",
"language:buc",
"language:bud",
"language:bue",
"language:buf",
"language:bug",
"language:buh",
"language:buk",
"language:bul",
"language:bum",
"language:bun",
"language:buo",
"language:bup",
"language:buq",
"language:bus",
"language:but",
"language:buw",
"language:bux",
"language:buy",
"language:bva",
"language:bvb",
"language:bvc",
"language:bvd",
"language:bvg",
"language:bvi",
"language:bvj",
"language:bvk",
"language:bvm",
"language:bvn",
"language:bvo",
"language:bvq",
"language:bvr",
"language:bvv",
"language:bvw",
"language:bvx",
"language:bvz",
"language:bwa",
"language:bwb",
"language:bwd",
"language:bwe",
"language:bwf",
"language:bwg",
"language:bwi",
"language:bwk",
"language:bwl",
"language:bwm",
"language:bwn",
"language:bwo",
"language:bwp",
"language:bwq",
"language:bwr",
"language:bws",
"language:bwt",
"language:bwu",
"language:bww",
"language:bwx",
"language:bxa",
"language:bxb",
"language:bxd",
"language:bxe",
"language:bxf",
"language:bxg",
"language:bxh",
"language:bxi",
"language:bxj",
"language:bxk",
"language:bxl",
"language:bxm",
"language:bxn",
"language:bxq",
"language:bxr",
"language:bxu",
"language:bxv",
"language:bxw",
"language:bxz",
"language:bya",
"language:byd",
"language:bye",
"language:byf",
"language:byi",
"language:byk",
"language:byl",
"language:bym",
"language:byn",
"language:byo",
"language:byp",
"language:byq",
"language:byr",
"language:bys",
"language:byt",
"language:byv",
"language:byw",
"language:byx",
"language:byz",
"language:bza",
"language:bzb",
"language:bzc",
"language:bzd",
"language:bze",
"language:bzf",
"language:bzg",
"language:bzh",
"language:bzi",
"language:bzj",
"language:bzk",
"language:bzl",
"language:bzn",
"language:bzp",
"language:bzq",
"language:bzr",
"language:bzt",
"language:bzu",
"language:bzv",
"language:bzw",
"language:bzx",
"language:bzy",
"language:bzz",
"language:caa",
"language:cab",
"language:cac",
"language:cad",
"language:cae",
"language:caf",
"language:cag",
"language:cah",
"language:caj",
"language:cak",
"language:cal",
"language:cam",
"language:can",
"language:cao",
"language:cap",
"language:caq",
"language:car",
"language:cas",
"language:cat",
"language:cav",
"language:caw",
"language:cax",
"language:cay",
"language:caz",
"language:cbb",
"language:cbc",
"language:cbd",
"language:cbg",
"language:cbi",
"language:cbj",
"language:cbk",
"language:cbn",
"language:cbo",
"language:cbr",
"language:cbs",
"language:cbt",
"language:cbu",
"language:cbv",
"language:cby",
"language:ccc",
"language:cce",
"language:ccg",
"language:cch",
"language:ccj",
"language:cco",
"language:ccp",
"language:ccr",
"language:cdf",
"language:cdh",
"language:cdj",
"language:cdm",
"language:cdn",
"language:cdo",
"language:cdr",
"language:cdy",
"language:cdz",
"language:cea",
"language:ceb",
"language:ceg",
"language:ces",
"language:cet",
"language:cfa",
"language:cfd",
"language:cfg",
"language:cfm",
"language:cgc",
"language:cgg",
"language:cha",
"language:chb",
"language:chc",
"language:chd",
"language:che",
"language:chf",
"language:chg",
"language:chh",
"language:chj",
"language:chk",
"language:chl",
"language:chm",
"language:chn",
"language:cho",
"language:chp",
"language:chq",
"language:chr",
"language:cht",
"language:chu",
"language:chv",
"language:chw",
"language:chx",
"language:chy",
"language:chz",
"language:cia",
"language:cic",
"language:cid",
"language:cie",
"language:cik",
"language:cim",
"language:cin",
"language:cip",
"language:cir",
"language:ciw",
"language:ciy",
"language:cja",
"language:cje",
"language:cjh",
"language:cji",
"language:cjk",
"language:cjm",
"language:cjn",
"language:cjo",
"language:cjp",
"language:cjs",
"language:cjv",
"language:cjy",
"language:ckb",
"language:ckh",
"language:ckl",
"language:cko",
"language:ckq",
"language:cks",
"language:ckt",
"language:cku",
"language:ckv",
"language:ckx",
"language:cla",
"language:clc",
"language:cld",
"language:cle",
"language:clh",
"language:cli",
"language:clk",
"language:cll",
"language:clm",
"language:clo",
"language:clw",
"language:cly",
"language:cme",
"language:cmg",
"language:cmi",
"language:cml",
"language:cmn",
"language:cmo",
"language:cnb",
"language:cng",
"language:cnh",
"language:cni",
"language:cnk",
"language:cnl",
"language:cnp",
"language:cnr",
"language:cns",
"language:cnt",
"language:cnu",
"language:cnx",
"language:cob",
"language:coc",
"language:cod",
"language:coe",
"language:cof",
"language:cog",
"language:coh",
"language:coj",
"language:cok",
"language:col",
"language:com",
"language:con",
"language:coo",
"language:cop",
"language:coq",
"language:cor",
"language:cos",
"language:cot",
"language:cou",
"language:cow",
"language:coz",
"language:cpa",
"language:cpb",
"language:cpc",
"language:cpg",
"language:cpi",
"language:cpn",
"language:cps",
"language:cpu",
"language:cpx",
"language:cqd",
"language:cra",
"language:crb",
"language:crc",
"language:crd",
"language:cre",
"language:crg",
"language:crh",
"language:cri",
"language:crj",
"language:crk",
"language:crl",
"language:crm",
"language:crn",
"language:cro",
"language:crq",
"language:crr",
"language:crs",
"language:crt",
"language:crw",
"language:crx",
"language:cry",
"language:crz",
"language:csa",
"language:csb",
"language:csh",
"language:csi",
"language:csm",
"language:cso",
"language:csp",
"language:css",
"language:cst",
"language:csw",
"language:csy",
"language:csz",
"language:cta",
"language:ctc",
"language:ctd",
"language:cte",
"language:ctg",
"language:ctl",
"language:ctm",
"language:ctn",
"language:cto",
"language:ctp",
"language:cts",
"language:ctu",
"language:ctz",
"language:cua",
"language:cub",
"language:cuc",
"language:cuh",
"language:cui",
"language:cuj",
"language:cuk",
"language:cul",
"language:cuo",
"language:cup",
"language:cuq",
"language:cur",
"language:cut",
"language:cuu",
"language:cuv",
"language:cux",
"language:cuy",
"language:cvg",
"language:cvn",
"language:cwd",
"language:cwe",
"language:cwg",
"language:cwt",
"language:cya",
"language:cyb",
"language:cym",
"language:cyo",
"language:czh",
"language:czk",
"language:czn",
"language:czo",
"language:czt",
"language:daa",
"language:dac",
"language:dad",
"language:dag",
"language:dah",
"language:dai",
"language:daj",
"language:dak",
"language:dal",
"language:dam",
"language:dan",
"language:daq",
"language:dar",
"language:dau",
"language:dav",
"language:daw",
"language:dax",
"language:dba",
"language:dbb",
"language:dbd",
"language:dbe",
"language:dbf",
"language:dbg",
"language:dbi",
"language:dbj",
"language:dbl",
"language:dbm",
"language:dbn",
"language:dbp",
"language:dbq",
"language:dbt",
"language:dbu",
"language:dbw",
"language:dby",
"language:dcc",
"language:dcr",
"language:dda",
"language:ddd",
"language:ddg",
"language:ddi",
"language:ddj",
"language:ddn",
"language:ddo",
"language:dds",
"language:ddw",
"language:dec",
"language:ded",
"language:dee",
"language:deg",
"language:dei",
"language:del",
"language:dem",
"language:den",
"language:dep",
"language:der",
"language:des",
"language:deu",
"language:dev",
"language:dga",
"language:dgb",
"language:dgc",
"language:dgd",
"language:dge",
"language:dgg",
"language:dgh",
"language:dgi",
"language:dgl",
"language:dgo",
"language:dgr",
"language:dgt",
"language:dgw",
"language:dgz",
"language:dhd",
"language:dhg",
"language:dhi",
"language:dhl",
"language:dhr",
"language:dhs",
"language:dhu",
"language:dhv",
"language:dia",
"language:dib",
"language:dic",
"language:did",
"language:dif",
"language:dig",
"language:dih",
"language:dii",
"language:dij",
"language:dik",
"language:dil",
"language:dim",
"language:din",
"language:dio",
"language:dip",
"language:diq",
"language:dir",
"language:dis",
"language:diu",
"language:div",
"language:diw",
"language:dix",
"language:diz",
"language:dja",
"language:djb",
"language:djc",
"language:djd",
"language:dje",
"language:dji",
"language:djj",
"language:djk",
"language:djm",
"language:djn",
"language:djr",
"language:dju",
"language:djw",
"language:dkr",
"language:dks",
"language:dlg",
"language:dlk",
"language:dlm",
"language:dma",
"language:dmb",
"language:dmc",
"language:dmd",
"language:dme",
"language:dmf",
"language:dmg",
"language:dmk",
"language:dml",
"language:dmm",
"language:dmo",
"language:dmr",
"language:dms",
"language:dmu",
"language:dmv",
"language:dmw",
"language:dmy",
"language:dna",
"language:dnd",
"language:dne",
"language:dng",
"language:dni",
"language:dnj",
"language:dnn",
"language:dnr",
"language:dnt",
"language:dnu",
"language:dnw",
"language:dny",
"language:doa",
"language:dob",
"language:doc",
"language:doe",
"language:dof",
"language:doh",
"language:doi",
"language:dok",
"language:dol",
"language:don",
"language:doo",
"language:dor",
"language:dos",
"language:dot",
"language:dow",
"language:dox",
"language:doy",
"language:doz",
"language:dpp",
"language:drb",
"language:drd",
"language:dre",
"language:drg",
"language:dri",
"language:drl",
"language:drn",
"language:dro",
"language:drs",
"language:dru",
"language:dry",
"language:dsb",
"language:dsh",
"language:dsn",
"language:dso",
"language:dsq",
"language:dta",
"language:dtb",
"language:dtd",
"language:dth",
"language:dti",
"language:dtk",
"language:dtm",
"language:dto",
"language:dtp",
"language:dtr",
"language:dts",
"language:dtt",
"language:dtu",
"language:dty",
"language:dua",
"language:duc",
"language:due",
"language:duf",
"language:dug",
"language:duk",
"language:dum",
"language:dun",
"language:duo",
"language:dup",
"language:duq",
"language:dur",
"language:dus",
"language:duu",
"language:duv",
"language:duw",
"language:dux",
"language:dva",
"language:dwa",
"language:dwk",
"language:dwr",
"language:dws",
"language:dwu",
"language:dww",
"language:dya",
"language:dyb",
"language:dyd",
"language:dyi",
"language:dym",
"language:dyn",
"language:dyo",
"language:dyu",
"language:dyy",
"language:dze",
"language:dzg",
"language:dzo",
"language:ebg",
"language:ebk",
"language:ebo",
"language:ebr",
"language:ebu",
"language:ecy",
"language:eee",
"language:efa",
"language:efi",
"language:ega",
"language:egl",
"language:ego",
"language:egy",
"language:ehu",
"language:eip",
"language:eit",
"language:eiv",
"language:eja",
"language:eka",
"language:eke",
"language:ekg",
"language:ekk",
"language:ekl",
"language:ekm",
"language:eko",
"language:ekp",
"language:ekr",
"language:eky",
"language:ele",
"language:eli",
"language:elk",
"language:ell",
"language:elm",
"language:elo",
"language:elx",
"language:ema",
"language:emb",
"language:eme",
"language:emg",
"language:emi",
"language:emk",
"language:emn",
"language:emp",
"language:ems",
"language:emw",
"language:emy",
"language:ena",
"language:enb",
"language:enc",
"language:end",
"language:enf",
"language:eng",
"language:enh",
"language:enl",
"language:enm",
"language:enn",
"language:eno",
"language:enq",
"language:enr",
"language:enw",
"language:enx",
"language:eot",
"language:epi",
"language:epo",
"language:erg",
"language:erh",
"language:eri",
"language:erk",
"language:ero",
"language:err",
"language:ers",
"language:ert",
"language:erw",
"language:ese",
"language:esg",
"language:esh",
"language:esi",
"language:esk",
"language:eso",
"language:esq",
"language:ess",
"language:esu",
"language:etb",
"language:etc",
"language:etn",
"language:eto",
"language:etr",
"language:ets",
"language:ett",
"language:etu",
"language:etx",
"language:etz",
"language:eus",
"language:eve",
"language:evh",
"language:evn",
"language:ewe",
"language:ewo",
"language:ext",
"language:eya",
"language:eyo",
"language:faa",
"language:fab",
"language:fad",
"language:faf",
"language:fag",
"language:fai",
"language:faj",
"language:fak",
"language:fal",
"language:fam",
"language:fan",
"language:fao",
"language:fap",
"language:far",
"language:fat",
"language:fau",
"language:fax",
"language:fay",
"language:fbl",
"language:fer",
"language:ffm",
"language:fgr",
"language:fia",
"language:fie",
"language:fij",
"language:fil",
"language:fin",
"language:fip",
"language:fir",
"language:fit",
"language:fkk",
"language:fkv",
"language:fla",
"language:flh",
"language:fli",
"language:fll",
"language:fln",
"language:flr",
"language:fly",
"language:fmp",
"language:fng",
"language:fni",
"language:fod",
"language:foi",
"language:fom",
"language:fon",
"language:for",
"language:fos",
"language:fpe",
"language:fqs",
"language:fra",
"language:frc",
"language:frd",
"language:frk",
"language:frm",
"language:fro",
"language:frp",
"language:frq",
"language:frr",
"language:frt",
"language:fry",
"language:fub",
"language:fuc",
"language:fud",
"language:fuf",
"language:fuh",
"language:fuj",
"language:ful",
"language:fun",
"language:fur",
"language:fut",
"language:fuu",
"language:fuv",
"language:fuy",
"language:fvr",
"language:fwa",
"language:gaa",
"language:gab",
"language:gac",
"language:gad",
"language:gae",
"language:gaf",
"language:gag",
"language:gah",
"language:gai",
"language:gaj",
"language:gal",
"language:gam",
"language:gan",
"language:gao",
"language:gap",
"language:gaq",
"language:gar",
"language:gat",
"language:gaw",
"language:gax",
"language:gay",
"language:gaz",
"language:gba",
"language:gbb",
"language:gbd",
"language:gbe",
"language:gbg",
"language:gbi",
"language:gbj",
"language:gbm",
"language:gbp",
"language:gbr",
"language:gbu",
"language:gbw",
"language:gbx",
"language:gby",
"language:gbz",
"language:gcc",
"language:gcd",
"language:gce",
"language:gcf",
"language:gcl",
"language:gcn",
"language:gcr",
"language:gdb",
"language:gdc",
"language:gdd",
"language:gde",
"language:gdf",
"language:gdg",
"language:gdh",
"language:gdi",
"language:gdj",
"language:gdl",
"language:gdm",
"language:gdn",
"language:gdo",
"language:gdq",
"language:gdr",
"language:gdu",
"language:gdx",
"language:gea",
"language:geb",
"language:ged",
"language:geg",
"language:geh",
"language:gei",
"language:gej",
"language:gek",
"language:gel",
"language:geq",
"language:ges",
"language:gev",
"language:gew",
"language:gey",
"language:gez",
"language:gfk",
"language:gft",
"language:gga",
"language:ggb",
"language:ggd",
"language:gge",
"language:ggk",
"language:ggl",
"language:ggt",
"language:ggu",
"language:ggw",
"language:gha",
"language:ghk",
"language:ghl",
"language:ghn",
"language:gho",
"language:ghs",
"language:ght",
"language:gia",
"language:gic",
"language:gid",
"language:gil",
"language:gim",
"language:gin",
"language:gip",
"language:giq",
"language:gir",
"language:gis",
"language:git",
"language:giw",
"language:gix",
"language:giy",
"language:giz",
"language:gjm",
"language:gjn",
"language:gju",
"language:gke",
"language:gkn",
"language:gkp",
"language:gku",
"language:gla",
"language:glc",
"language:gld",
"language:gle",
"language:glg",
"language:glh",
"language:glj",
"language:glk",
"language:glo",
"language:glv",
"language:glw",
"language:gly",
"language:gma",
"language:gmb",
"language:gmh",
"language:gml",
"language:gmm",
"language:gmu",
"language:gmv",
"language:gmy",
"language:gna",
"language:gnc",
"language:gnd",
"language:gne",
"language:gng",
"language:gnh",
"language:gni",
"language:gnk",
"language:gnl",
"language:gnm",
"language:gnn",
"language:gno",
"language:gnq",
"language:gnr",
"language:gnu",
"language:goa",
"language:gob",
"language:goc",
"language:god",
"language:gof",
"language:gog",
"language:goh",
"language:goi",
"language:gol",
"language:gom",
"language:gon",
"language:gop",
"language:gor",
"language:gos",
"language:got",
"language:gou",
"language:gow",
"language:gox",
"language:gpa",
"language:gpe",
"language:gpn",
"language:gqa",
"language:gqi",
"language:gqn",
"language:gqr",
"language:gqu",
"language:gra",
"language:grb",
"language:grc",
"language:grd",
"language:grg",
"language:grh",
"language:gri",
"language:grn",
"language:gro",
"language:grq",
"language:grr",
"language:grs",
"language:grt",
"language:gru",
"language:grx",
"language:gsl",
"language:gsn",
"language:gso",
"language:gsp",
"language:gsw",
"language:gta",
"language:gtu",
"language:gua",
"language:gub",
"language:guc",
"language:gud",
"language:gue",
"language:guf",
"language:gug",
"language:guh",
"language:gui",
"language:guj",
"language:guk",
"language:gul",
"language:gum",
"language:gun",
"language:guo",
"language:gup",
"language:guq",
"language:gur",
"language:gut",
"language:guu",
"language:gux",
"language:guz",
"language:gva",
"language:gvc",
"language:gve",
"language:gvf",
"language:gvj",
"language:gvl",
"language:gvm",
"language:gvn",
"language:gvo",
"language:gvp",
"language:gvr",
"language:gvs",
"language:gwa",
"language:gwc",
"language:gwd",
"language:gwe",
"language:gwf",
"language:gwi",
"language:gwj",
"language:gwn",
"language:gwr",
"language:gwt",
"language:gwu",
"language:gww",
"language:gwx",
"language:gxx",
"language:gya",
"language:gyb",
"language:gyd",
"language:gye",
"language:gyl",
"language:gym",
"language:gyn",
"language:gyo",
"language:gyr",
"language:gyy",
"language:gza",
"language:gzi",
"language:gzn",
"language:haa",
"language:hac",
"language:had",
"language:hae",
"language:hag",
"language:hah",
"language:hai",
"language:haj",
"language:hak",
"language:hal",
"language:ham",
"language:han",
"language:hao",
"language:haq",
"language:har",
"language:has",
"language:hat",
"language:hau",
"language:hav",
"language:haw",
"language:hax",
"language:hay",
"language:haz",
"language:hbb",
"language:hbn",
"language:hbo",
"language:hbs",
"language:hbu",
"language:hca",
"language:hch",
"language:hdn",
"language:hdy",
"language:hea",
"language:heb",
"language:hed",
"language:heg",
"language:heh",
"language:hei",
"language:hem",
"language:her",
"language:hgm",
"language:hia",
"language:hib",
"language:hid",
"language:hif",
"language:hig",
"language:hih",
"language:hik",
"language:hil",
"language:hin",
"language:hio",
"language:hit",
"language:hiw",
"language:hix",
"language:hka",
"language:hke",
"language:hkk",
"language:hla",
"language:hlb",
"language:hlu",
"language:hmb",
"language:hmd",
"language:hml",
"language:hmn",
"language:hmo",
"language:hmq",
"language:hmr",
"language:hms",
"language:hmt",
"language:hmu",
"language:hmv",
"language:hna",
"language:hnd",
"language:hne",
"language:hng",
"language:hnh",
"language:hni",
"language:hnj",
"language:hnn",
"language:hno",
"language:hns",
"language:hnu",
"language:hoa",
"language:hob",
"language:hoc",
"language:hod",
"language:hoe",
"language:hoh",
"language:hoi",
"language:hoj",
"language:hol",
"language:hoo",
"language:hop",
"language:hot",
"language:hov",
"language:how",
"language:hoz",
"language:hpo",
"language:hra",
"language:hrc",
"language:hre",
"language:hrk",
"language:hro",
"language:hrt",
"language:hru",
"language:hrv",
"language:hrw",
"language:hrx",
"language:hrz",
"language:hsb",
"language:hsn",
"language:hss",
"language:hti",
"language:hto",
"language:hts",
"language:htu",
"language:htx",
"language:hub",
"language:huc",
"language:hud",
"language:hue",
"language:huf",
"language:hug",
"language:huh",
"language:hui",
"language:huk",
"language:hul",
"language:hum",
"language:hun",
"language:huo",
"language:hup",
"language:huq",
"language:hur",
"language:hus",
"language:hut",
"language:huu",
"language:huv",
"language:huw",
"language:hux",
"language:huz",
"language:hvc",
"language:hve",
"language:hvn",
"language:hvv",
"language:hwa",
"language:hwc",
"language:hwo",
"language:hya",
"language:hye",
"language:hyw",
"language:iai",
"language:ian",
"language:iar",
"language:iba",
"language:ibb",
"language:ibd",
"language:ibe",
"language:ibg",
"language:ibl",
"language:ibn",
"language:ibo",
"language:ibr",
"language:ibu",
"language:iby",
"language:ica",
"language:ich",
"language:ida",
"language:idb",
"language:idc",
"language:idd",
"language:idi",
"language:ido",
"language:idr",
"language:idt",
"language:idu",
"language:ifa",
"language:ifb",
"language:ife",
"language:iff",
"language:ifk",
"language:ifm",
"language:ifu",
"language:ify",
"language:igb",
"language:ige",
"language:igl",
"language:igm",
"language:ign",
"language:igo",
"language:igs",
"language:ihp",
"language:iii",
"language:iin",
"language:ijc",
"language:ije",
"language:ijj",
"language:ijn",
"language:ijs",
"language:ike",
"language:iki",
"language:ikl",
"language:iko",
"language:ikt",
"language:iku",
"language:ikw",
"language:ikx",
"language:ikz",
"language:ilb",
"language:ile",
"language:ilk",
"language:ilo",
"language:ilp",
"language:ilu",
"language:ilv",
"language:imi",
"language:iml",
"language:imn",
"language:imo",
"language:imr",
"language:ims",
"language:ina",
"language:inb",
"language:ind",
"language:ing",
"language:inh",
"language:inj",
"language:inm",
"language:inn",
"language:ino",
"language:inp",
"language:int",
"language:inz",
"language:ior",
"language:iou",
"language:iow",
"language:ipi",
"language:ipk",
"language:ipo",
"language:iqu",
"language:ire",
"language:irh",
"language:iri",
"language:irk",
"language:irn",
"language:iru",
"language:irx",
"language:iry",
"language:isa",
"language:isc",
"language:isd",
"language:isi",
"language:isk",
"language:isl",
"language:iso",
"language:ist",
"language:isu",
"language:ita",
"language:itb",
"language:itd",
"language:ite",
"language:iti",
"language:itk",
"language:itl",
"language:itm",
"language:ito",
"language:itr",
"language:itv",
"language:itw",
"language:itx",
"language:itz",
"language:ium",
"language:ivb",
"language:ivv",
"language:iwm",
"language:iws",
"language:ixc",
"language:ixl",
"language:iyo",
"language:iyx",
"language:izh",
"language:izr",
"language:izz",
"language:jaa",
"language:jab",
"language:jac",
"language:jae",
"language:jaf",
"language:jah",
"language:jaj",
"language:jak",
"language:jal",
"language:jam",
"language:jao",
"language:jaq",
"language:jau",
"language:jav",
"language:jax",
"language:jay",
"language:jaz",
"language:jbe",
"language:jbi",
"language:jbj",
"language:jbk",
"language:jbn",
"language:jbo",
"language:jbt",
"language:jbu",
"language:jct",
"language:jdt",
"language:jeb",
"language:jee",
"language:jeh",
"language:jei",
"language:jek",
"language:jel",
"language:jen",
"language:jer",
"language:jet",
"language:jeu",
"language:jge",
"language:jgk",
"language:jgo",
"language:jhi",
"language:jia",
"language:jib",
"language:jic",
"language:jid",
"language:jig",
"language:jii",
"language:jil",
"language:jim",
"language:jio",
"language:jit",
"language:jiu",
"language:jiv",
"language:jjr",
"language:jka",
"language:jkm",
"language:jko",
"language:jkp",
"language:jku",
"language:jle",
"language:jma",
"language:jmb",
"language:jmc",
"language:jmd",
"language:jmi",
"language:jml",
"language:jmr",
"language:jms",
"language:jmx",
"language:jna",
"language:jng",
"language:jni",
"language:jnj",
"language:jns",
"language:job",
"language:jod",
"language:jor",
"language:jow",
"language:jpa",
"language:jpn",
"language:jqr",
"language:jra",
"language:jrb",
"language:jrr",
"language:jru",
"language:jua",
"language:jub",
"language:juc",
"language:jud",
"language:juh",
"language:juk",
"language:jul",
"language:jum",
"language:jun",
"language:juo",
"language:jup",
"language:jur",
"language:jut",
"language:juu",
"language:juw",
"language:juy",
"language:jvn",
"language:jya",
"language:kaa",
"language:kab",
"language:kac",
"language:kad",
"language:kae",
"language:kag",
"language:kah",
"language:kai",
"language:kaj",
"language:kak",
"language:kal",
"language:kam",
"language:kan",
"language:kao",
"language:kap",
"language:kaq",
"language:kas",
"language:kat",
"language:kau",
"language:kav",
"language:kaw",
"language:kay",
"language:kaz",
"language:kbb",
"language:kbc",
"language:kbd",
"language:kbe",
"language:kbh",
"language:kbi",
"language:kbj",
"language:kbk",
"language:kbl",
"language:kbm",
"language:kbn",
"language:kbo",
"language:kbp",
"language:kbq",
"language:kbr",
"language:kbs",
"language:kbt",
"language:kbv",
"language:kbw",
"language:kbx",
"language:kby",
"language:kbz",
"language:kca",
"language:kcb",
"language:kcd",
"language:kcf",
"language:kcg",
"language:kci",
"language:kcj",
"language:kck",
"language:kcl",
"language:kcm",
"language:kcn",
"language:kco",
"language:kcp",
"language:kcr",
"language:kct",
"language:kcu",
"language:kcv",
"language:kcw",
"language:kcx",
"language:kcy",
"language:kda",
"language:kdc",
"language:kdd",
"language:kde",
"language:kdf",
"language:kdh",
"language:kdi",
"language:kdj",
"language:kdk",
"language:kdl",
"language:kdm",
"language:kdn",
"language:kdp",
"language:kdq",
"language:kdr",
"language:kdt",
"language:kdu",
"language:kdw",
"language:kdx",
"language:kdz",
"language:kea",
"language:keb",
"language:kec",
"language:ked",
"language:kee",
"language:kef",
"language:keg",
"language:kei",
"language:kek",
"language:kel",
"language:kem",
"language:ken",
"language:keo",
"language:ker",
"language:kes",
"language:ket",
"language:keu",
"language:kew",
"language:kex",
"language:kez",
"language:kfa",
"language:kfb",
"language:kfc",
"language:kfd",
"language:kfe",
"language:kff",
"language:kfi",
"language:kfk",
"language:kfm",
"language:kfn",
"language:kfo",
"language:kfp",
"language:kfq",
"language:kfr",
"language:kfy",
"language:kfz",
"language:kga",
"language:kgb",
"language:kge",
"language:kgf",
"language:kgg",
"language:kgj",
"language:kgk",
"language:kgl",
"language:kgo",
"language:kgp",
"language:kgq",
"language:kgr",
"language:kgs",
"language:kgt",
"language:kgu",
"language:kgv",
"language:kgw",
"language:kgx",
"language:kgy",
"language:kha",
"language:khb",
"language:khc",
"language:khe",
"language:khf",
"language:khg",
"language:khh",
"language:khj",
"language:khk",
"language:khl",
"language:khm",
"language:khn",
"language:kho",
"language:khp",
"language:khq",
"language:khr",
"language:khs",
"language:kht",
"language:khu",
"language:khv",
"language:khw",
"language:khy",
"language:khz",
"language:kia",
"language:kib",
"language:kic",
"language:kid",
"language:kie",
"language:kig",
"language:kih",
"language:kii",
"language:kij",
"language:kik",
"language:kil",
"language:kim",
"language:kin",
"language:kio",
"language:kiq",
"language:kir",
"language:kis",
"language:kit",
"language:kiu",
"language:kiv",
"language:kiw",
"language:kiy",
"language:kiz",
"language:kja",
"language:kjb",
"language:kjc",
"language:kjd",
"language:kje",
"language:kjg",
"language:kjh",
"language:kji",
"language:kjj",
"language:kjk",
"language:kjl",
"language:kjm",
"language:kjn",
"language:kjp",
"language:kjq",
"language:kjr",
"language:kjs",
"language:kju",
"language:kjx",
"language:kjy",
"language:kjz",
"language:kka",
"language:kkb",
"language:kkc",
"language:kke",
"language:kkh",
"language:kki",
"language:kkj",
"language:kkk",
"language:kkl",
"language:kkm",
"language:kko",
"language:kkp",
"language:kks",
"language:kkw",
"language:kky",
"language:kkz",
"language:kla",
"language:klb",
"language:klc",
"language:kld",
"language:kle",
"language:klg",
"language:klh",
"language:kli",
"language:klj",
"language:kll",
"language:klm",
"language:kln",
"language:klp",
"language:klq",
"language:klr",
"language:kls",
"language:klt",
"language:klu",
"language:klv",
"language:klw",
"language:klx",
"language:kly",
"language:klz",
"language:kma",
"language:kmb",
"language:kmc",
"language:kmd",
"language:kmf",
"language:kmg",
"language:kmh",
"language:kmi",
"language:kmk",
"language:kml",
"language:kmm",
"language:kmn",
"language:kmo",
"language:kmq",
"language:kmr",
"language:kms",
"language:kmt",
"language:kmu",
"language:kmv",
"language:kmw",
"language:kmx",
"language:kmz",
"language:kna",
"language:knb",
"language:knc",
"language:knd",
"language:kne",
"language:knf",
"language:kng",
"language:kni",
"language:knj",
"language:knk",
"language:knm",
"language:knn",
"language:kno",
"language:knp",
"language:knq",
"language:knr",
"language:kns",
"language:knt",
"language:knv",
"language:knw",
"language:knx",
"language:kny",
"language:kod",
"language:koe",
"language:kof",
"language:kog",
"language:koh",
"language:koi",
"language:kok",
"language:kol",
"language:kon",
"language:koo",
"language:kop",
"language:koq",
"language:kor",
"language:kos",
"language:kot",
"language:kou",
"language:kov",
"language:koy",
"language:koz",
"language:kpc",
"language:kpe",
"language:kpf",
"language:kpg",
"language:kph",
"language:kpj",
"language:kpk",
"language:kpl",
"language:kpm",
"language:kpn",
"language:kpo",
"language:kpq",
"language:kpr",
"language:kps",
"language:kpt",
"language:kpu",
"language:kpv",
"language:kpw",
"language:kpx",
"language:kpy",
"language:kpz",
"language:kqa",
"language:kqb",
"language:kqc",
"language:kqe",
"language:kqf",
"language:kqi",
"language:kqj",
"language:kql",
"language:kqn",
"language:kqo",
"language:kqq",
"language:kqr",
"language:kqs",
"language:kqt",
"language:kqu",
"language:kqv",
"language:kqw",
"language:kqx",
"language:kqy",
"language:kqz",
"language:krb",
"language:krc",
"language:krd",
"language:kre",
"language:krf",
"language:krh",
"language:kri",
"language:krj",
"language:krk",
"language:krl",
"language:krp",
"language:krr",
"language:krs",
"language:kru",
"language:krx",
"language:kry",
"language:krz",
"language:ksb",
"language:ksc",
"language:ksd",
"language:kse",
"language:ksf",
"language:ksg",
"language:ksh",
"language:ksi",
"language:ksj",
"language:ksk",
"language:ksl",
"language:ksm",
"language:ksn",
"language:ksp",
"language:ksq",
"language:ksr",
"language:kss",
"language:kst",
"language:ksv",
"language:ksw",
"language:ksx",
"language:ksz",
"language:ktb",
"language:ktd",
"language:kte",
"language:ktg",
"language:kth",
"language:kti",
"language:ktj",
"language:ktk",
"language:ktm",
"language:ktn",
"language:kto",
"language:ktp",
"language:kts",
"language:ktt",
"language:ktu",
"language:ktv",
"language:ktw",
"language:ktx",
"language:ktz",
"language:kua",
"language:kub",
"language:kud",
"language:kue",
"language:kug",
"language:kui",
"language:kuj",
"language:kul",
"language:kum",
"language:kun",
"language:kuo",
"language:kup",
"language:kuq",
"language:kus",
"language:kut",
"language:kuu",
"language:kux",
"language:kuy",
"language:kuz",
"language:kva",
"language:kvb",
"language:kvc",
"language:kvd",
"language:kve",
"language:kvf",
"language:kvg",
"language:kvh",
"language:kvj",
"language:kvm",
"language:kvn",
"language:kvo",
"language:kvq",
"language:kvr",
"language:kvu",
"language:kvv",
"language:kvw",
"language:kvy",
"language:kwa",
"language:kwb",
"language:kwd",
"language:kwe",
"language:kwf",
"language:kwg",
"language:kwh",
"language:kwi",
"language:kwj",
"language:kwk",
"language:kwl",
"language:kwn",
"language:kwo",
"language:kwp",
"language:kws",
"language:kwv",
"language:kwy",
"language:kwz",
"language:kxa",
"language:kxb",
"language:kxc",
"language:kxd",
"language:kxf",
"language:kxh",
"language:kxi",
"language:kxj",
"language:kxm",
"language:kxn",
"language:kxo",
"language:kxv",
"language:kxw",
"language:kxz",
"language:kya",
"language:kyc",
"language:kye",
"language:kyf",
"language:kyh",
"language:kyi",
"language:kyj",
"language:kyk",
"language:kyl",
"language:kym",
"language:kyo",
"language:kyq",
"language:kyr",
"language:kys",
"language:kyt",
"language:kyu",
"language:kyx",
"language:kyz",
"language:kza",
"language:kzb",
"language:kzc",
"language:kzd",
"language:kze",
"language:kzf",
"language:kzg",
"language:kzi",
"language:kzk",
"language:kzl",
"language:kzm",
"language:kzn",
"language:kzo",
"language:kzp",
"language:kzq",
"language:kzr",
"language:kzs",
"language:kzu",
"language:kzw",
"language:kzx",
"language:kzz",
"language:laa",
"language:lac",
"language:lad",
"language:lae",
"language:laf",
"language:lag",
"language:lah",
"language:lai",
"language:laj",
"language:lam",
"language:lan",
"language:lao",
"language:lap",
"language:laq",
"language:lar",
"language:las",
"language:lat",
"language:lau",
"language:law",
"language:lax",
"language:laz",
"language:lbb",
"language:lbc",
"language:lbe",
"language:lbf",
"language:lbj",
"language:lbk",
"language:lbl",
"language:lbn",
"language:lbo",
"language:lbq",
"language:lbr",
"language:lbt",
"language:lbu",
"language:lbv",
"language:lbw",
"language:lbx",
"language:lby",
"language:lbz",
"language:lcc",
"language:lch",
"language:lcm",
"language:lcp",
"language:lcq",
"language:lcs",
"language:lda",
"language:ldb",
"language:ldi",
"language:ldj",
"language:ldk",
"language:ldl",
"language:ldm",
"language:ldn",
"language:lea",
"language:leb",
"language:lec",
"language:led",
"language:lee",
"language:lef",
"language:leh",
"language:lei",
"language:lej",
"language:lek",
"language:lem",
"language:len",
"language:lep",
"language:leq",
"language:ler",
"language:les",
"language:let",
"language:leu",
"language:lev",
"language:lew",
"language:lex",
"language:lez",
"language:lfa",
"language:lfn",
"language:lga",
"language:lgb",
"language:lgg",
"language:lgh",
"language:lgi",
"language:lgk",
"language:lgl",
"language:lgn",
"language:lgq",
"language:lgr",
"language:lgt",
"language:lgu",
"language:lha",
"language:lhh",
"language:lhi",
"language:lhm",
"language:lhn",
"language:lhs",
"language:lht",
"language:lhu",
"language:lia",
"language:lib",
"language:lic",
"language:lid",
"language:lif",
"language:lig",
"language:lih",
"language:lij",
"language:lil",
"language:lim",
"language:lin",
"language:lio",
"language:lip",
"language:liq",
"language:lir",
"language:lis",
"language:lit",
"language:liu",
"language:liv",
"language:lix",
"language:liy",
"language:liz",
"language:lja",
"language:lje",
"language:lji",
"language:ljl",
"language:ljp",
"language:lka",
"language:lkb",
"language:lkd",
"language:lke",
"language:lkh",
"language:lki",
"language:lkl",
"language:lkn",
"language:lkr",
"language:lkt",
"language:lku",
"language:lky",
"language:lla",
"language:llc",
"language:lld",
"language:lle",
"language:llf",
"language:llg",
"language:lli",
"language:llk",
"language:lll",
"language:llm",
"language:lln",
"language:llp",
"language:llu",
"language:lma",
"language:lmb",
"language:lmc",
"language:lme",
"language:lmg",
"language:lmh",
"language:lmk",
"language:lml",
"language:lmn",
"language:lmo",
"language:lmp",
"language:lmr",
"language:lmu",
"language:lmw",
"language:lmx",
"language:lmy",
"language:lna",
"language:lnb",
"language:lnd",
"language:lnh",
"language:lnj",
"language:lnl",
"language:lnn",
"language:lns",
"language:lnu",
"language:loa",
"language:lob",
"language:loc",
"language:loe",
"language:lof",
"language:log",
"language:loh",
"language:loi",
"language:loj",
"language:lok",
"language:lol",
"language:lom",
"language:lon",
"language:loo",
"language:lop",
"language:lor",
"language:los",
"language:lot",
"language:lou",
"language:loy",
"language:loz",
"language:lpa",
"language:lpe",
"language:lra",
"language:lrc",
"language:lre",
"language:lrg",
"language:lrl",
"language:lro",
"language:lrv",
"language:lrz",
"language:lsa",
"language:lsd",
"language:lse",
"language:lsh",
"language:lsi",
"language:lsm",
"language:lsr",
"language:ltc",
"language:ltg",
"language:lti",
"language:ltn",
"language:ltz",
"language:lua",
"language:lub",
"language:luc",
"language:lud",
"language:lue",
"language:luf",
"language:lug",
"language:lui",
"language:luj",
"language:lul",
"language:lum",
"language:lun",
"language:luo",
"language:lup",
"language:luq",
"language:lur",
"language:lus",
"language:lut",
"language:luw",
"language:luy",
"language:lva",
"language:lvk",
"language:lvs",
"language:lwh",
"language:lwl",
"language:lwo",
"language:lww",
"language:lyg",
"language:lyn",
"language:lzh",
"language:lzl",
"language:lzz",
"language:maa",
"language:mad",
"language:mae",
"language:maf",
"language:mag",
"language:mah",
"language:mai",
"language:maj",
"language:mak",
"language:mal",
"language:mam",
"language:man",
"language:maq",
"language:mar",
"language:mas",
"language:mat",
"language:mau",
"language:mav",
"language:maw",
"language:max",
"language:maz",
"language:mbb",
"language:mbc",
"language:mbd",
"language:mbe",
"language:mbf",
"language:mbh",
"language:mbi",
"language:mbj",
"language:mbl",
"language:mbm",
"language:mbn",
"language:mbo",
"language:mbp",
"language:mbq",
"language:mbr",
"language:mbs",
"language:mbt",
"language:mbu",
"language:mbv",
"language:mbw",
"language:mbx",
"language:mbz",
"language:mca",
"language:mcb",
"language:mcc",
"language:mcd",
"language:mcf",
"language:mcg",
"language:mch",
"language:mci",
"language:mcj",
"language:mck",
"language:mcm",
"language:mcn",
"language:mco",
"language:mcp",
"language:mcq",
"language:mcr",
"language:mcs",
"language:mct",
"language:mcu",
"language:mcv",
"language:mcw",
"language:mcx",
"language:mcy",
"language:mcz",
"language:mda",
"language:mdb",
"language:mdc",
"language:mdd",
"language:mde",
"language:mdf",
"language:mdh",
"language:mdi",
"language:mdj",
"language:mdm",
"language:mdp",
"language:mdq",
"language:mdr",
"language:mds",
"language:mdt",
"language:mdv",
"language:mdw",
"language:mdx",
"language:mdy",
"language:mdz",
"language:mea",
"language:meb",
"language:mec",
"language:med",
"language:mee",
"language:mef",
"language:mei",
"language:mej",
"language:mek",
"language:mel",
"language:mem",
"language:men",
"language:meo",
"language:mep",
"language:meq",
"language:mer",
"language:met",
"language:meu",
"language:mev",
"language:mew",
"language:mey",
"language:mez",
"language:mfa",
"language:mfb",
"language:mfc",
"language:mfd",
"language:mfe",
"language:mff",
"language:mfg",
"language:mfh",
"language:mfi",
"language:mfj",
"language:mfl",
"language:mfm",
"language:mfn",
"language:mfo",
"language:mfp",
"language:mfq",
"language:mfr",
"language:mft",
"language:mfu",
"language:mfv",
"language:mfw",
"language:mfx",
"language:mfy",
"language:mfz",
"language:mga",
"language:mgb",
"language:mgc",
"language:mgd",
"language:mge",
"language:mgf",
"language:mgh",
"language:mgi",
"language:mgk",
"language:mgl",
"language:mgm",
"language:mgo",
"language:mgp",
"language:mgq",
"language:mgr",
"language:mgs",
"language:mgu",
"language:mgv",
"language:mgw",
"language:mgy",
"language:mgz",
"language:mha",
"language:mhb",
"language:mhc",
"language:mhd",
"language:mhe",
"language:mhf",
"language:mhg",
"language:mhi",
"language:mhj",
"language:mhk",
"language:mhl",
"language:mhn",
"language:mho",
"language:mhq",
"language:mhr",
"language:mhs",
"language:mht",
"language:mhu",
"language:mhw",
"language:mhx",
"language:mhy",
"language:mhz",
"language:mia",
"language:mib",
"language:mic",
"language:mid",
"language:mie",
"language:mif",
"language:mig",
"language:mih",
"language:mij",
"language:mik",
"language:mil",
"language:mim",
"language:min",
"language:mio",
"language:mip",
"language:miq",
"language:mir",
"language:mit",
"language:miu",
"language:miw",
"language:mix",
"language:mjc",
"language:mjd",
"language:mjg",
"language:mjh",
"language:mji",
"language:mjj",
"language:mjk",
"language:mjm",
"language:mjs",
"language:mjt",
"language:mjw",
"language:mjx",
"language:mjy",
"language:mka",
"language:mkc",
"language:mkd",
"language:mke",
"language:mkf",
"language:mkg",
"language:mki",
"language:mkj",
"language:mkl",
"language:mkm",
"language:mkn",
"language:mkp",
"language:mkq",
"language:mkr",
"language:mks",
"language:mkv",
"language:mkw",
"language:mky",
"language:mkz",
"language:mla",
"language:mlc",
"language:mle",
"language:mlf",
"language:mlg",
"language:mlh",
"language:mlj",
"language:mlk",
"language:mll",
"language:mlm",
"language:mln",
"language:mlp",
"language:mlq",
"language:mlr",
"language:mls",
"language:mlt",
"language:mlu",
"language:mlv",
"language:mlw",
"language:mlx",
"language:mma",
"language:mmb",
"language:mmd",
"language:mme",
"language:mmf",
"language:mmg",
"language:mmh",
"language:mmi",
"language:mmm",
"language:mmn",
"language:mmo",
"language:mmp",
"language:mmq",
"language:mmr",
"language:mmt",
"language:mmu",
"language:mmw",
"language:mmx",
"language:mmy",
"language:mmz",
"language:mna",
"language:mnb",
"language:mnc",
"language:mnd",
"language:mne",
"language:mnf",
"language:mng",
"language:mnh",
"language:mni",
"language:mnj",
"language:mnk",
"language:mnl",
"language:mnm",
"language:mnp",
"language:mnq",
"language:mnr",
"language:mns",
"language:mnv",
"language:mnw",
"language:mnx",
"language:mny",
"language:mnz",
"language:moa",
"language:moc",
"language:mod",
"language:moe",
"language:mog",
"language:moh",
"language:moi",
"language:moj",
"language:mok",
"language:mom",
"language:mop",
"language:moq",
"language:mor",
"language:mos",
"language:mot",
"language:mou",
"language:mov",
"language:mox",
"language:moy",
"language:moz",
"language:mpa",
"language:mpb",
"language:mpc",
"language:mpd",
"language:mpe",
"language:mpg",
"language:mph",
"language:mpi",
"language:mpj",
"language:mpk",
"language:mpl",
"language:mpm",
"language:mpn",
"language:mpp",
"language:mpq",
"language:mpr",
"language:mps",
"language:mpt",
"language:mpu",
"language:mpv",
"language:mpw",
"language:mpx",
"language:mpy",
"language:mpz",
"language:mqa",
"language:mqb",
"language:mqc",
"language:mqe",
"language:mqf",
"language:mqj",
"language:mqk",
"language:mql",
"language:mqm",
"language:mqn",
"language:mqo",
"language:mqp",
"language:mqr",
"language:mqs",
"language:mqu",
"language:mqv",
"language:mqw",
"language:mqx",
"language:mqy",
"language:mqz",
"language:mra",
"language:mrb",
"language:mrc",
"language:mrf",
"language:mrg",
"language:mrh",
"language:mri",
"language:mrj",
"language:mrk",
"language:mrl",
"language:mrm",
"language:mrn",
"language:mro",
"language:mrp",
"language:mrq",
"language:mrr",
"language:mrs",
"language:mrt",
"language:mru",
"language:mrv",
"language:mrw",
"language:mrx",
"language:mry",
"language:mrz",
"language:msa",
"language:msb",
"language:mse",
"language:msf",
"language:msg",
"language:msh",
"language:msj",
"language:msk",
"language:msl",
"language:msm",
"language:msn",
"language:mso",
"language:mss",
"language:msu",
"language:msw",
"language:msx",
"language:msy",
"language:msz",
"language:mta",
"language:mtc",
"language:mtd",
"language:mte",
"language:mtf",
"language:mtg",
"language:mth",
"language:mti",
"language:mtj",
"language:mtk",
"language:mtl",
"language:mtm",
"language:mtn",
"language:mto",
"language:mtp",
"language:mtq",
"language:mtr",
"language:mts",
"language:mtt",
"language:mtu",
"language:mtv",
"language:mty",
"language:mua",
"language:mub",
"language:muc",
"language:mue",
"language:mug",
"language:muh",
"language:mui",
"language:muk",
"language:mul",
"language:muo",
"language:mup",
"language:mur",
"language:mus",
"language:muu",
"language:mux",
"language:muy",
"language:muz",
"language:mva",
"language:mvb",
"language:mvd",
"language:mve",
"language:mvf",
"language:mvi",
"language:mvn",
"language:mvo",
"language:mvp",
"language:mvq",
"language:mvr",
"language:mvt",
"language:mvu",
"language:mvv",
"language:mvx",
"language:mvy",
"language:mvz",
"language:mwa",
"language:mwb",
"language:mwc",
"language:mwe",
"language:mwf",
"language:mwg",
"language:mwh",
"language:mwi",
"language:mwl",
"language:mwm",
"language:mwn",
"language:mwo",
"language:mwp",
"language:mwq",
"language:mwr",
"language:mwt",
"language:mwv",
"language:mww",
"language:mxb",
"language:mxc",
"language:mxd",
"language:mxe",
"language:mxf",
"language:mxg",
"language:mxi",
"language:mxj",
"language:mxk",
"language:mxm",
"language:mxn",
"language:mxp",
"language:mxq",
"language:mxr",
"language:mxt",
"language:mxu",
"language:mxv",
"language:mxw",
"language:mxx",
"language:mxy",
"language:mxz",
"language:mya",
"language:myb",
"language:mye",
"language:myf",
"language:myg",
"language:myh",
"language:myj",
"language:myk",
"language:myl",
"language:mym",
"language:myo",
"language:myp",
"language:myr",
"language:mys",
"language:myu",
"language:myv",
"language:myw",
"language:myx",
"language:myy",
"language:myz",
"language:mzb",
"language:mzd",
"language:mze",
"language:mzh",
"language:mzi",
"language:mzj",
"language:mzk",
"language:mzm",
"language:mzn",
"language:mzo",
"language:mzp",
"language:mzq",
"language:mzr",
"language:mzs",
"language:mzt",
"language:mzu",
"language:mzv",
"language:mzw",
"language:mzz",
"language:nab",
"language:nac",
"language:nae",
"language:naf",
"language:nag",
"language:naj",
"language:nak",
"language:nal",
"language:nam",
"language:nap",
"language:naq",
"language:nar",
"language:nas",
"language:nat",
"language:nau",
"language:nav",
"language:naw",
"language:nax",
"language:nay",
"language:naz",
"language:nba",
"language:nbb",
"language:nbc",
"language:nbe",
"language:nbh",
"language:nbi",
"language:nbj",
"language:nbk",
"language:nbl",
"language:nbm",
"language:nbn",
"language:nbo",
"language:nbp",
"language:nbq",
"language:nbr",
"language:nbu",
"language:nbv",
"language:nbw",
"language:nby",
"language:nca",
"language:ncb",
"language:ncc",
"language:ncd",
"language:nce",
"language:ncf",
"language:ncg",
"language:nch",
"language:nci",
"language:ncj",
"language:nck",
"language:ncl",
"language:ncm",
"language:ncn",
"language:nco",
"language:ncr",
"language:nct",
"language:ncu",
"language:ncx",
"language:ncz",
"language:nda",
"language:ndb",
"language:ndc",
"language:ndd",
"language:nde",
"language:ndg",
"language:ndh",
"language:ndi",
"language:ndj",
"language:ndm",
"language:ndn",
"language:ndo",
"language:ndq",
"language:ndr",
"language:nds",
"language:ndt",
"language:ndu",
"language:ndv",
"language:ndx",
"language:ndy",
"language:ndz",
"language:neb",
"language:nec",
"language:nee",
"language:neg",
"language:neh",
"language:nej",
"language:nem",
"language:nen",
"language:neo",
"language:neq",
"language:ner",
"language:net",
"language:neu",
"language:nev",
"language:new",
"language:ney",
"language:nez",
"language:nfa",
"language:nfd",
"language:nfl",
"language:nfr",
"language:nfu",
"language:nga",
"language:ngb",
"language:ngc",
"language:nge",
"language:ngg",
"language:ngh",
"language:ngi",
"language:ngj",
"language:ngk",
"language:ngl",
"language:ngn",
"language:ngp",
"language:ngq",
"language:ngr",
"language:ngs",
"language:ngt",
"language:ngu",
"language:ngv",
"language:ngx",
"language:ngy",
"language:ngz",
"language:nha",
"language:nhb",
"language:nhc",
"language:nhd",
"language:nhe",
"language:nhf",
"language:nhg",
"language:nhi",
"language:nhm",
"language:nhn",
"language:nho",
"language:nhp",
"language:nhq",
"language:nhr",
"language:nht",
"language:nhu",
"language:nhv",
"language:nhw",
"language:nhx",
"language:nhy",
"language:nia",
"language:nib",
"language:nid",
"language:nie",
"language:nif",
"language:nig",
"language:nih",
"language:nii",
"language:nij",
"language:nil",
"language:nim",
"language:nin",
"language:nio",
"language:niq",
"language:nir",
"language:nis",
"language:nit",
"language:niu",
"language:niv",
"language:niw",
"language:niy",
"language:niz",
"language:nja",
"language:njb",
"language:njh",
"language:nji",
"language:njj",
"language:njl",
"language:njm",
"language:njn",
"language:njo",
"language:njr",
"language:nju",
"language:njy",
"language:njz",
"language:nka",
"language:nkb",
"language:nkc",
"language:nkd",
"language:nke",
"language:nkf",
"language:nkg",
"language:nkh",
"language:nki",
"language:nkj",
"language:nkk",
"language:nko",
"language:nkp",
"language:nkr",
"language:nks",
"language:nku",
"language:nkw",
"language:nkx",
"language:nkz",
"language:nla",
"language:nlc",
"language:nld",
"language:nle",
"language:nlg",
"language:nli",
"language:nlj",
"language:nll",
"language:nlo",
"language:nlu",
"language:nlx",
"language:nly",
"language:nlz",
"language:nma",
"language:nmb",
"language:nmc",
"language:nmd",
"language:nme",
"language:nmf",
"language:nmg",
"language:nmh",
"language:nmj",
"language:nmk",
"language:nml",
"language:nmm",
"language:nmn",
"language:nmp",
"language:nms",
"language:nmt",
"language:nmu",
"language:nmv",
"language:nmw",
"language:nmx",
"language:nmy",
"language:nmz",
"language:nna",
"language:nnb",
"language:nnc",
"language:nnd",
"language:nnf",
"language:nng",
"language:nnh",
"language:nnj",
"language:nnk",
"language:nnl",
"language:nnm",
"language:nno",
"language:nnp",
"language:nnq",
"language:nnr",
"language:nnt",
"language:nnu",
"language:nnv",
"language:nnw",
"language:nny",
"language:nnz",
"language:noa",
"language:nob",
"language:noc",
"language:nod",
"language:noe",
"language:nog",
"language:noh",
"language:noj",
"language:nok",
"language:nol",
"language:nom",
"language:non",
"language:nop",
"language:noq",
"language:nor",
"language:not",
"language:nou",
"language:nov",
"language:now",
"language:noz",
"language:nph",
"language:npi",
"language:npl",
"language:npn",
"language:npo",
"language:nps",
"language:npy",
"language:nqg",
"language:nqk",
"language:nqm",
"language:nqo",
"language:nra",
"language:nrb",
"language:nre",
"language:nrf",
"language:nrg",
"language:nri",
"language:nrk",
"language:nrl",
"language:nrm",
"language:nrn",
"language:nrt",
"language:nru",
"language:nrz",
"language:nsa",
"language:nsb",
"language:nse",
"language:nsg",
"language:nsk",
"language:nsm",
"language:nsn",
"language:nso",
"language:nsq",
"language:nss",
"language:nst",
"language:nsu",
"language:nsw",
"language:nsx",
"language:nsy",
"language:nsz",
"language:nti",
"language:ntj",
"language:ntk",
"language:ntm",
"language:nto",
"language:ntp",
"language:ntu",
"language:ntw",
"language:nty",
"language:ntz",
"language:nua",
"language:nuc",
"language:nud",
"language:nuf",
"language:nug",
"language:nui",
"language:nuj",
"language:nuk",
"language:nul",
"language:num",
"language:nun",
"language:nuo",
"language:nup",
"language:nur",
"language:nus",
"language:nut",
"language:nuw",
"language:nux",
"language:nuy",
"language:nuz",
"language:nvh",
"language:nvo",
"language:nwa",
"language:nwb",
"language:nwe",
"language:nwi",
"language:nwr",
"language:nwy",
"language:nxa",
"language:nxe",
"language:nxg",
"language:nxi",
"language:nxl",
"language:nxn",
"language:nxq",
"language:nxr",
"language:nxx",
"language:nya",
"language:nyb",
"language:nyd",
"language:nye",
"language:nyf",
"language:nyh",
"language:nyi",
"language:nyj",
"language:nyk",
"language:nyl",
"language:nym",
"language:nyn",
"language:nyo",
"language:nyp",
"language:nyq",
"language:nys",
"language:nyt",
"language:nyu",
"language:nyv",
"language:nyx",
"language:nyy",
"language:nza",
"language:nzb",
"language:nzd",
"language:nzi",
"language:nzk",
"language:nzm",
"language:nzy",
"language:nzz",
"language:oaa",
"language:oac",
"language:oar",
"language:obi",
"language:obk",
"language:obm",
"language:obo",
"language:obr",
"language:obt",
"language:oca",
"language:och",
"language:oci",
"language:oco",
"language:ocu",
"language:odt",
"language:odu",
"language:ofo",
"language:ofs",
"language:ofu",
"language:ogb",
"language:ogc",
"language:oge",
"language:ogo",
"language:ogu",
"language:oht",
"language:oia",
"language:ojb",
"language:ojc",
"language:ojg",
"language:oji",
"language:ojp",
"language:ojs",
"language:ojv",
"language:ojw",
"language:oka",
"language:okb",
"language:okd",
"language:oke",
"language:oki",
"language:okj",
"language:okn",
"language:oko",
"language:okr",
"language:oks",
"language:oku",
"language:okv",
"language:ola",
"language:old",
"language:ole",
"language:olm",
"language:olo",
"language:olt",
"language:oma",
"language:omb",
"language:omc",
"language:omg",
"language:omk",
"language:oml",
"language:omo",
"language:omr",
"language:omt",
"language:omu",
"language:omw",
"language:omx",
"language:ona",
"language:onb",
"language:one",
"language:ong",
"language:oni",
"language:onj",
"language:onn",
"language:ono",
"language:onp",
"language:ons",
"language:onu",
"language:onw",
"language:ood",
"language:oog",
"language:oon",
"language:opa",
"language:opk",
"language:opm",
"language:opo",
"language:opt",
"language:opy",
"language:ora",
"language:orc",
"language:ore",
"language:org",
"language:orh",
"language:ori",
"language:orm",
"language:oro",
"language:orr",
"language:ort",
"language:oru",
"language:orv",
"language:orw",
"language:orx",
"language:ory",
"language:orz",
"language:osa",
"language:osc",
"language:osi",
"language:osp",
"language:oss",
"language:osx",
"language:ota",
"language:otd",
"language:ote",
"language:oti",
"language:otk",
"language:otl",
"language:otm",
"language:otq",
"language:otr",
"language:ots",
"language:ott",
"language:otu",
"language:otw",
"language:otz",
"language:oua",
"language:oub",
"language:oum",
"language:ovd",
"language:owi",
"language:owl",
"language:oyb",
"language:oyd",
"language:oym",
"language:ozm",
"language:pab",
"language:pac",
"language:pad",
"language:pae",
"language:paf",
"language:pag",
"language:pah",
"language:pai",
"language:pak",
"language:pal",
"language:pam",
"language:pan",
"language:pao",
"language:pap",
"language:paq",
"language:par",
"language:pas",
"language:pau",
"language:pav",
"language:paw",
"language:pay",
"language:paz",
"language:pbb",
"language:pbe",
"language:pbf",
"language:pbg",
"language:pbh",
"language:pbi",
"language:pbn",
"language:pbo",
"language:pbp",
"language:pbr",
"language:pbs",
"language:pbt",
"language:pbu",
"language:pbv",
"language:pby",
"language:pca",
"language:pcb",
"language:pcc",
"language:pcd",
"language:pce",
"language:pci",
"language:pcj",
"language:pck",
"language:pcm",
"language:pcn",
"language:pcp",
"language:pda",
"language:pdc",
"language:pdo",
"language:pdt",
"language:pdu",
"language:peb",
"language:ped",
"language:pee",
"language:pef",
"language:peg",
"language:peh",
"language:pei",
"language:pej",
"language:pek",
"language:pem",
"language:peo",
"language:pep",
"language:peq",
"language:pes",
"language:pex",
"language:pez",
"language:pfe",
"language:pfl",
"language:pga",
"language:pgd",
"language:pgk",
"language:pgn",
"language:pgu",
"language:pha",
"language:phg",
"language:phk",
"language:phl",
"language:phn",
"language:pho",
"language:pht",
"language:pia",
"language:pib",
"language:pic",
"language:pid",
"language:pie",
"language:pif",
"language:pih",
"language:pij",
"language:pil",
"language:pim",
"language:pin",
"language:pio",
"language:pip",
"language:pir",
"language:pis",
"language:pit",
"language:piu",
"language:piv",
"language:piw",
"language:pix",
"language:piz",
"language:pjt",
"language:pka",
"language:pkb",
"language:pkc",
"language:pkg",
"language:pkh",
"language:pkn",
"language:pko",
"language:pkp",
"language:pkt",
"language:pku",
"language:pla",
"language:plb",
"language:plc",
"language:ple",
"language:plg",
"language:plh",
"language:pli",
"language:plj",
"language:plk",
"language:pll",
"language:pln",
"language:plo",
"language:plq",
"language:plr",
"language:pls",
"language:plt",
"language:plu",
"language:plv",
"language:plw",
"language:ply",
"language:plz",
"language:pma",
"language:pmb",
"language:pmf",
"language:pmh",
"language:pmi",
"language:pmk",
"language:pml",
"language:pmm",
"language:pmn",
"language:pmo",
"language:pmq",
"language:pmr",
"language:pms",
"language:pmt",
"language:pmw",
"language:pmy",
"language:pmz",
"language:pna",
"language:pnb",
"language:pne",
"language:png",
"language:pnh",
"language:pni",
"language:pnj",
"language:pnn",
"language:pno",
"language:pnp",
"language:pnq",
"language:pnr",
"language:pns",
"language:pnt",
"language:pnu",
"language:pnv",
"language:pnw",
"language:pny",
"language:poc",
"language:poe",
"language:pof",
"language:pog",
"language:poh",
"language:poi",
"language:pok",
"language:pol",
"language:pom",
"language:pon",
"language:poo",
"language:poq",
"language:por",
"language:pos",
"language:pot",
"language:pov",
"language:pow",
"language:pox",
"language:poy",
"language:ppe",
"language:ppi",
"language:ppk",
"language:ppl",
"language:ppm",
"language:ppn",
"language:ppo",
"language:ppq",
"language:ppt",
"language:ppu",
"language:pqa",
"language:pqm",
"language:prc",
"language:prd",
"language:pre",
"language:prf",
"language:prg",
"language:prh",
"language:pri",
"language:prk",
"language:prm",
"language:prn",
"language:pro",
"language:prq",
"language:prr",
"language:prs",
"language:pru",
"language:prw",
"language:prx",
"language:psa",
"language:pse",
"language:psh",
"language:psi",
"language:psm",
"language:psn",
"language:psq",
"language:pss",
"language:pst",
"language:psu",
"language:psw",
"language:pta",
"language:pth",
"language:pti",
"language:ptn",
"language:pto",
"language:ptp",
"language:ptr",
"language:ptu",
"language:ptv",
"language:pua",
"language:pub",
"language:puc",
"language:pue",
"language:pui",
"language:puj",
"language:pum",
"language:puo",
"language:pup",
"language:puq",
"language:pur",
"language:puu",
"language:puw",
"language:puy",
"language:pwa",
"language:pwg",
"language:pwi",
"language:pwm",
"language:pwn",
"language:pwo",
"language:pym",
"language:pyn",
"language:pyu",
"language:pyy",
"language:qua",
"language:qub",
"language:quc",
"language:que",
"language:quf",
"language:qug",
"language:quh",
"language:qui",
"language:quk",
"language:qul",
"language:qum",
"language:qun",
"language:qup",
"language:qus",
"language:quv",
"language:quw",
"language:qux",
"language:quy",
"language:quz",
"language:qva",
"language:qvc",
"language:qve",
"language:qvi",
"language:qvl",
"language:qvn",
"language:qvo",
"language:qvs",
"language:qvw",
"language:qvy",
"language:qvz",
"language:qwa",
"language:qwh",
"language:qwm",
"language:qwt",
"language:qxa",
"language:qxn",
"language:qxo",
"language:qxp",
"language:qxq",
"language:qxr",
"language:qxs",
"language:qxu",
"language:qxw",
"language:qya",
"language:qyp",
"language:raa",
"language:rab",
"language:rac",
"language:rad",
"language:raf",
"language:rag",
"language:rah",
"language:rai",
"language:raj",
"language:rak",
"language:ral",
"language:ram",
"language:ran",
"language:rao",
"language:rap",
"language:rar",
"language:ras",
"language:rau",
"language:rav",
"language:raw",
"language:ray",
"language:raz",
"language:rbb",
"language:rbp",
"language:rcf",
"language:rea",
"language:reb",
"language:ree",
"language:reg",
"language:rej",
"language:rel",
"language:rem",
"language:ren",
"language:res",
"language:rey",
"language:rga",
"language:rgk",
"language:rgn",
"language:rgr",
"language:rgs",
"language:rhg",
"language:rhp",
"language:rif",
"language:ril",
"language:rim",
"language:rin",
"language:rir",
"language:rit",
"language:riu",
"language:rji",
"language:rjs",
"language:rkb",
"language:rkh",
"language:rki",
"language:rkm",
"language:rma",
"language:rmb",
"language:rmc",
"language:rme",
"language:rmf",
"language:rmg",
"language:rmh",
"language:rmi",
"language:rmk",
"language:rml",
"language:rmm",
"language:rmn",
"language:rmo",
"language:rmp",
"language:rmq",
"language:rmt",
"language:rmu",
"language:rmv",
"language:rmw",
"language:rmy",
"language:rnd",
"language:rng",
"language:rnn",
"language:rnp",
"language:rnr",
"language:rnw",
"language:rob",
"language:rod",
"language:roe",
"language:rof",
"language:rog",
"language:roh",
"language:rol",
"language:rom",
"language:ron",
"language:roo",
"language:rop",
"language:rou",
"language:rpn",
"language:rpt",
"language:rri",
"language:rro",
"language:rsb",
"language:rth",
"language:rtm",
"language:rtw",
"language:rub",
"language:rue",
"language:ruf",
"language:rug",
"language:rui",
"language:ruk",
"language:run",
"language:ruo",
"language:rup",
"language:ruq",
"language:rus",
"language:rut",
"language:ruu",
"language:rwa",
"language:rwk",
"language:rwm",
"language:rwo",
"language:rwr",
"language:ryn",
"language:rys",
"language:ryu",
"language:sab",
"language:sac",
"language:sad",
"language:sae",
"language:saf",
"language:sag",
"language:sah",
"language:saj",
"language:sak",
"language:sam",
"language:san",
"language:sao",
"language:saq",
"language:sar",
"language:sas",
"language:sat",
"language:sau",
"language:sav",
"language:saw",
"language:sax",
"language:say",
"language:saz",
"language:sba",
"language:sbb",
"language:sbc",
"language:sbd",
"language:sbe",
"language:sbf",
"language:sbg",
"language:sbh",
"language:sbi",
"language:sbk",
"language:sbl",
"language:sbm",
"language:sbo",
"language:sbp",
"language:sbq",
"language:sbr",
"language:sbs",
"language:sbt",
"language:sbu",
"language:sbv",
"language:sbw",
"language:scb",
"language:sce",
"language:scg",
"language:sci",
"language:sck",
"language:scl",
"language:scn",
"language:sco",
"language:scp",
"language:scq",
"language:scs",
"language:scu",
"language:scv",
"language:scw",
"language:sda",
"language:sdc",
"language:sde",
"language:sdg",
"language:sdh",
"language:sdj",
"language:sdn",
"language:sdo",
"language:sdp",
"language:sdq",
"language:sds",
"language:sdt",
"language:sdz",
"language:sea",
"language:sec",
"language:sed",
"language:see",
"language:sef",
"language:seh",
"language:sei",
"language:sej",
"language:sek",
"language:sel",
"language:sen",
"language:seo",
"language:seq",
"language:ser",
"language:ses",
"language:set",
"language:seu",
"language:sev",
"language:sew",
"language:sey",
"language:sga",
"language:sgc",
"language:sgd",
"language:sge",
"language:sgh",
"language:sgi",
"language:sgm",
"language:sgp",
"language:sgr",
"language:sgs",
"language:sgt",
"language:sgw",
"language:sgy",
"language:sgz",
"language:sha",
"language:shb",
"language:she",
"language:shg",
"language:shh",
"language:shi",
"language:shj",
"language:shk",
"language:shl",
"language:shn",
"language:sho",
"language:shp",
"language:shq",
"language:shr",
"language:shs",
"language:sht",
"language:shu",
"language:shv",
"language:shw",
"language:shx",
"language:shy",
"language:shz",
"language:sia",
"language:sib",
"language:sid",
"language:sie",
"language:sif",
"language:sig",
"language:sii",
"language:sij",
"language:sik",
"language:sil",
"language:sim",
"language:sin",
"language:sip",
"language:siq",
"language:sir",
"language:sis",
"language:siu",
"language:siw",
"language:six",
"language:siy",
"language:siz",
"language:sja",
"language:sjd",
"language:sje",
"language:sjg",
"language:sjk",
"language:sjl",
"language:sjm",
"language:sjn",
"language:sjo",
"language:sjr",
"language:sjs",
"language:sjt",
"language:sju",
"language:sjw",
"language:ska",
"language:skb",
"language:skc",
"language:skd",
"language:ske",
"language:skf",
"language:skg",
"language:skh",
"language:ski",
"language:skj",
"language:skm",
"language:skr",
"language:sks",
"language:skt",
"language:sku",
"language:skv",
"language:skw",
"language:skx",
"language:sky",
"language:skz",
"language:slc",
"language:slg",
"language:slh",
"language:sli",
"language:slj",
"language:slk",
"language:sll",
"language:slm",
"language:sln",
"language:slp",
"language:slr",
"language:slu",
"language:slv",
"language:slw",
"language:sly",
"language:slz",
"language:sma",
"language:smb",
"language:smc",
"language:sme",
"language:smg",
"language:smj",
"language:smk",
"language:sml",
"language:smn",
"language:smo",
"language:smp",
"language:smq",
"language:smr",
"language:sms",
"language:smw",
"language:smx",
"language:smy",
"language:sna",
"language:snc",
"language:snd",
"language:sne",
"language:snf",
"language:sng",
"language:sni",
"language:snk",
"language:snl",
"language:snn",
"language:sno",
"language:snp",
"language:snq",
"language:snr",
"language:sns",
"language:snu",
"language:snv",
"language:snw",
"language:snx",
"language:sny",
"language:snz",
"language:sob",
"language:soc",
"language:sod",
"language:soe",
"language:sog",
"language:soh",
"language:sok",
"language:sol",
"language:som",
"language:sop",
"language:soq",
"language:sor",
"language:sos",
"language:sot",
"language:sou",
"language:sov",
"language:sow",
"language:soz",
"language:spa",
"language:spb",
"language:spc",
"language:spd",
"language:spe",
"language:spi",
"language:spl",
"language:spm",
"language:spn",
"language:spo",
"language:spp",
"language:spr",
"language:sps",
"language:spu",
"language:spx",
"language:spy",
"language:sqa",
"language:sqi",
"language:sqn",
"language:sqo",
"language:sqq",
"language:sqr",
"language:sqt",
"language:squ",
"language:sra",
"language:srb",
"language:src",
"language:srd",
"language:srf",
"language:srh",
"language:sri",
"language:srk",
"language:srl",
"language:srm",
"language:srn",
"language:sro",
"language:srp",
"language:srq",
"language:srr",
"language:srs",
"language:sru",
"language:srv",
"language:srw",
"language:sry",
"language:srz",
"language:ssb",
"language:ssc",
"language:ssd",
"language:sse",
"language:ssf",
"language:ssg",
"language:ssj",
"language:ssk",
"language:ssl",
"language:ssm",
"language:sso",
"language:ssp",
"language:ssq",
"language:sss",
"language:sst",
"language:ssu",
"language:ssv",
"language:ssw",
"language:ssx",
"language:ssy",
"language:ssz",
"language:ste",
"language:stf",
"language:stg",
"language:sth",
"language:sti",
"language:stj",
"language:stk",
"language:stl",
"language:stn",
"language:sto",
"language:stp",
"language:stq",
"language:str",
"language:sts",
"language:stu",
"language:stv",
"language:stw",
"language:sua",
"language:sub",
"language:suc",
"language:sue",
"language:sug",
"language:sui",
"language:suj",
"language:suk",
"language:sun",
"language:suq",
"language:sur",
"language:sus",
"language:sut",
"language:suv",
"language:suw",
"language:sux",
"language:suy",
"language:suz",
"language:sva",
"language:svb",
"language:svc",
"language:sve",
"language:svs",
"language:swb",
"language:swc",
"language:swe",
"language:swf",
"language:swg",
"language:swh",
"language:swi",
"language:swj",
"language:swm",
"language:swn",
"language:swo",
"language:swp",
"language:swq",
"language:sws",
"language:swt",
"language:swu",
"language:swv",
"language:sww",
"language:swx",
"language:swy",
"language:sxb",
"language:sxe",
"language:sxg",
"language:sxm",
"language:sxn",
"language:sxr",
"language:sxu",
"language:sya",
"language:syb",
"language:syc",
"language:syi",
"language:syk",
"language:syl",
"language:sym",
"language:syn",
"language:syr",
"language:sys",
"language:syw",
"language:sza",
"language:szb",
"language:szc",
"language:szd",
"language:sze",
"language:szg",
"language:szl",
"language:szn",
"language:szp",
"language:szv",
"language:szw",
"language:taa",
"language:tab",
"language:tac",
"language:tad",
"language:tae",
"language:taf",
"language:tag",
"language:tah",
"language:taj",
"language:tal",
"language:tam",
"language:tan",
"language:tao",
"language:tap",
"language:taq",
"language:tar",
"language:tas",
"language:tat",
"language:tau",
"language:tav",
"language:tay",
"language:taz",
"language:tba",
"language:tbc",
"language:tbd",
"language:tbe",
"language:tbf",
"language:tbg",
"language:tbh",
"language:tbi",
"language:tbj",
"language:tbk",
"language:tbl",
"language:tbm",
"language:tbn",
"language:tbo",
"language:tbp",
"language:tbr",
"language:tbs",
"language:tbt",
"language:tbu",
"language:tbv",
"language:tbw",
"language:tbx",
"language:tby",
"language:tbz",
"language:tca",
"language:tcb",
"language:tcc",
"language:tcd",
"language:tce",
"language:tcf",
"language:tcg",
"language:tci",
"language:tck",
"language:tcl",
"language:tcm",
"language:tcn",
"language:tcq",
"language:tcs",
"language:tct",
"language:tcx",
"language:tcy",
"language:tcz",
"language:tda",
"language:tdc",
"language:tdd",
"language:tde",
"language:tdf",
"language:tdg",
"language:tdh",
"language:tdi",
"language:tdj",
"language:tdk",
"language:tdl",
"language:tdm",
"language:tdn",
"language:tdo",
"language:tdr",
"language:tds",
"language:tdt",
"language:tdv",
"language:tdx",
"language:tdy",
"language:tea",
"language:tec",
"language:ted",
"language:tee",
"language:teg",
"language:teh",
"language:tei",
"language:tek",
"language:tel",
"language:tem",
"language:ten",
"language:teo",
"language:tep",
"language:teq",
"language:ter",
"language:tes",
"language:tet",
"language:teu",
"language:tev",
"language:tew",
"language:tex",
"language:tey",
"language:tfn",
"language:tfr",
"language:tft",
"language:tgb",
"language:tgc",
"language:tgd",
"language:tgh",
"language:tgi",
"language:tgj",
"language:tgk",
"language:tgl",
"language:tgn",
"language:tgo",
"language:tgp",
"language:tgq",
"language:tgr",
"language:tgs",
"language:tgt",
"language:tgu",
"language:tgw",
"language:tgx",
"language:tgy",
"language:tha",
"language:thd",
"language:the",
"language:thf",
"language:thk",
"language:thl",
"language:thm",
"language:thp",
"language:thq",
"language:thr",
"language:ths",
"language:tht",
"language:thv",
"language:thz",
"language:tia",
"language:tic",
"language:tif",
"language:tig",
"language:tih",
"language:tii",
"language:tij",
"language:tik",
"language:til",
"language:tim",
"language:tin",
"language:tio",
"language:tip",
"language:tiq",
"language:tir",
"language:tis",
"language:tit",
"language:tiu",
"language:tiv",
"language:tiw",
"language:tix",
"language:tiy",
"language:tiz",
"language:tjg",
"language:tji",
"language:tjm",
"language:tjs",
"language:tju",
"language:tjw",
"language:tkd",
"language:tke",
"language:tkg",
"language:tkl",
"language:tkm",
"language:tkn",
"language:tkp",
"language:tkq",
"language:tkr",
"language:tks",
"language:tku",
"language:tkw",
"language:tlb",
"language:tlc",
"language:tld",
"language:tlf",
"language:tlg",
"language:tlh",
"language:tli",
"language:tlj",
"language:tlk",
"language:tll",
"language:tlm",
"language:tlo",
"language:tlp",
"language:tlq",
"language:tlr",
"language:tls",
"language:tlt",
"language:tlu",
"language:tlv",
"language:tlx",
"language:tly",
"language:tma",
"language:tmb",
"language:tmc",
"language:tmd",
"language:tmf",
"language:tmi",
"language:tmj",
"language:tmm",
"language:tmn",
"language:tmo",
"language:tmq",
"language:tmr",
"language:tms",
"language:tmt",
"language:tmu",
"language:tmw",
"language:tmy",
"language:tmz",
"language:tna",
"language:tnc",
"language:tnh",
"language:tni",
"language:tnk",
"language:tnl",
"language:tnm",
"language:tnn",
"language:tno",
"language:tnp",
"language:tnq",
"language:tnr",
"language:tnt",
"language:tnw",
"language:tnx",
"language:tnz",
"language:tob",
"language:toc",
"language:tof",
"language:tog",
"language:toh",
"language:toi",
"language:toj",
"language:tol",
"language:tom",
"language:ton",
"language:too",
"language:top",
"language:tos",
"language:tow",
"language:tox",
"language:tpa",
"language:tpc",
"language:tpe",
"language:tpf",
"language:tpg",
"language:tpi",
"language:tpj",
"language:tpl",
"language:tpm",
"language:tpn",
"language:tpp",
"language:tpr",
"language:tpt",
"language:tpu",
"language:tpw",
"language:tpx",
"language:tpy",
"language:tpz",
"language:tqb",
"language:tql",
"language:tqo",
"language:tqp",
"language:tqq",
"language:tqu",
"language:tqw",
"language:tra",
"language:trc",
"language:trd",
"language:trf",
"language:trg",
"language:trh",
"language:tri",
"language:trm",
"language:trn",
"language:tro",
"language:trp",
"language:trq",
"language:trr",
"language:trs",
"language:trt",
"language:tru",
"language:trv",
"language:trw",
"language:trx",
"language:trz",
"language:tsa",
"language:tsb",
"language:tsc",
"language:tsd",
"language:tsg",
"language:tsh",
"language:tsi",
"language:tsj",
"language:tsk",
"language:tsl",
"language:tsn",
"language:tso",
"language:tsr",
"language:tsu",
"language:tsv",
"language:tsw",
"language:tsx",
"language:tsz",
"language:tta",
"language:ttc",
"language:ttd",
"language:tte",
"language:ttf",
"language:ttg",
"language:tti",
"language:ttj",
"language:ttk",
"language:ttm",
"language:ttn",
"language:ttq",
"language:ttr",
"language:tts",
"language:ttt",
"language:ttu",
"language:ttv",
"language:ttw",
"language:tty",
"language:tua",
"language:tub",
"language:tuc",
"language:tud",
"language:tue",
"language:tuf",
"language:tug",
"language:tuh",
"language:tui",
"language:tuk",
"language:tul",
"language:tum",
"language:tun",
"language:tuo",
"language:tuq",
"language:tur",
"language:tus",
"language:tuu",
"language:tuv",
"language:tux",
"language:tuy",
"language:tuz",
"language:tva",
"language:tvd",
"language:tve",
"language:tvk",
"language:tvl",
"language:tvm",
"language:tvo",
"language:tvs",
"language:tvu",
"language:tvw",
"language:tvy",
"language:twa",
"language:twb",
"language:twd",
"language:twe",
"language:twf",
"language:twg",
"language:twh",
"language:twi",
"language:twm",
"language:twp",
"language:twq",
"language:twt",
"language:twu",
"language:tww",
"language:twx",
"language:twy",
"language:txa",
"language:txb",
"language:txc",
"language:txe",
"language:txg",
"language:txh",
"language:txi",
"language:txm",
"language:txn",
"language:txo",
"language:txs",
"language:txt",
"language:txu",
"language:txx",
"language:txy",
"language:tya",
"language:tye",
"language:tyi",
"language:tyj",
"language:tyn",
"language:typ",
"language:tyr",
"language:tyu",
"language:tyv",
"language:tyx",
"language:tyz",
"language:tzh",
"language:tzj",
"language:tzl",
"language:tzm",
"language:tzn",
"language:tzo",
"language:tzx",
"language:uam",
"language:uan",
"language:uar",
"language:uba",
"language:ubi",
"language:ubl",
"language:ubr",
"language:ubu",
"language:uby",
"language:uda",
"language:ude",
"language:udi",
"language:udj",
"language:udl",
"language:udm",
"language:udu",
"language:ues",
"language:ufi",
"language:uga",
"language:uge",
"language:ugo",
"language:uha",
"language:uhn",
"language:uig",
"language:uiv",
"language:uji",
"language:uka",
"language:ukg",
"language:ukh",
"language:uki",
"language:ukp",
"language:ukq",
"language:ukr",
"language:uku",
"language:ukv",
"language:ula",
"language:ulc",
"language:ule",
"language:ulf",
"language:uli",
"language:ulk",
"language:ulm",
"language:uln",
"language:ulu",
"language:ulw",
"language:uma",
"language:umb",
"language:umc",
"language:umg",
"language:umm",
"language:umo",
"language:ump",
"language:umr",
"language:ums",
"language:umu",
"language:una",
"language:und",
"language:une",
"language:ung",
"language:unk",
"language:unm",
"language:unn",
"language:unr",
"language:unz",
"language:upv",
"language:ura",
"language:urb",
"language:urc",
"language:urd",
"language:ure",
"language:urf",
"language:urg",
"language:urh",
"language:uri",
"language:urk",
"language:urn",
"language:urp",
"language:urr",
"language:urt",
"language:uru",
"language:urv",
"language:urw",
"language:urx",
"language:ury",
"language:urz",
"language:usa",
"language:ush",
"language:usk",
"language:usp",
"language:usu",
"language:uta",
"language:ute",
"language:utp",
"language:utr",
"language:utu",
"language:uum",
"language:uur",
"language:uuu",
"language:uve",
"language:uvh",
"language:uvl",
"language:uwa",
"language:uya",
"language:uzn",
"language:uzs",
"language:vaa",
"language:vae",
"language:vaf",
"language:vag",
"language:vah",
"language:vai",
"language:vaj",
"language:val",
"language:vam",
"language:van",
"language:vao",
"language:var",
"language:vas",
"language:vau",
"language:vay",
"language:vbb",
"language:vbk",
"language:vec",
"language:ved",
"language:vel",
"language:vem",
"language:ven",
"language:veo",
"language:vep",
"language:ver",
"language:vic",
"language:vie",
"language:vif",
"language:vig",
"language:vil",
"language:vin",
"language:vit",
"language:viv",
"language:vka",
"language:vkj",
"language:vkl",
"language:vkm",
"language:vko",
"language:vkp",
"language:vku",
"language:vlp",
"language:vls",
"language:vma",
"language:vmb",
"language:vme",
"language:vmf",
"language:vmg",
"language:vmi",
"language:vmk",
"language:vml",
"language:vmm",
"language:vmp",
"language:vmr",
"language:vmw",
"language:vmy",
"language:vmz",
"language:vnk",
"language:vnm",
"language:vnp",
"language:vol",
"language:vor",
"language:vot",
"language:vra",
"language:vro",
"language:vrs",
"language:vrt",
"language:vun",
"language:vut",
"language:waa",
"language:wab",
"language:wac",
"language:wad",
"language:wae",
"language:wah",
"language:waj",
"language:wal",
"language:wam",
"language:wan",
"language:wao",
"language:wap",
"language:waq",
"language:war",
"language:was",
"language:wat",
"language:wau",
"language:wav",
"language:waw",
"language:wax",
"language:way",
"language:waz",
"language:wba",
"language:wbb",
"language:wbe",
"language:wbf",
"language:wbh",
"language:wbi",
"language:wbj",
"language:wbk",
"language:wbl",
"language:wbm",
"language:wbp",
"language:wbr",
"language:wbt",
"language:wbv",
"language:wbw",
"language:wca",
"language:wci",
"language:wdd",
"language:wdg",
"language:wdj",
"language:wea",
"language:wed",
"language:weg",
"language:weh",
"language:wei",
"language:weo",
"language:wep",
"language:wer",
"language:wes",
"language:wet",
"language:wew",
"language:wfg",
"language:wga",
"language:wgb",
"language:wgg",
"language:wgi",
"language:wgo",
"language:wgu",
"language:wgy",
"language:wha",
"language:whk",
"language:wib",
"language:wic",
"language:wig",
"language:wih",
"language:wii",
"language:wik",
"language:wil",
"language:wim",
"language:win",
"language:wir",
"language:wiu",
"language:wiv",
"language:wiy",
"language:wja",
"language:wji",
"language:wka",
"language:wkw",
"language:wla",
"language:wlc",
"language:wle",
"language:wlg",
"language:wlh",
"language:wli",
"language:wlk",
"language:wll",
"language:wlm",
"language:wln",
"language:wlo",
"language:wlr",
"language:wls",
"language:wlu",
"language:wlv",
"language:wlw",
"language:wly",
"language:wmb",
"language:wmc",
"language:wmd",
"language:wme",
"language:wmh",
"language:wmi",
"language:wmo",
"language:wms",
"language:wmt",
"language:wmw",
"language:wmx",
"language:wnb",
"language:wnc",
"language:wnd",
"language:wne",
"language:wni",
"language:wnk",
"language:wno",
"language:wnp",
"language:wnu",
"language:wnw",
"language:wny",
"language:woa",
"language:wob",
"language:woc",
"language:wod",
"language:woe",
"language:wof",
"language:wog",
"language:woi",
"language:wok",
"language:wol",
"language:wom",
"language:won",
"language:wos",
"language:wow",
"language:wpc",
"language:wrb",
"language:wrg",
"language:wrh",
"language:wri",
"language:wrk",
"language:wrl",
"language:wrm",
"language:wrn",
"language:wro",
"language:wrp",
"language:wrr",
"language:wrs",
"language:wru",
"language:wrv",
"language:wrw",
"language:wry",
"language:wrz",
"language:wsa",
"language:wsg",
"language:wsi",
"language:wsk",
"language:wsr",
"language:wsv",
"language:wtf",
"language:wth",
"language:wti",
"language:wtm",
"language:wtw",
"language:wub",
"language:wuh",
"language:wul",
"language:wum",
"language:wun",
"language:wut",
"language:wuu",
"language:wuv",
"language:wux",
"language:wuy",
"language:wwa",
"language:wwo",
"language:wwr",
"language:www",
"language:wxa",
"language:wyb",
"language:wyi",
"language:wym",
"language:wyr",
"language:wyy",
"language:xaa",
"language:xab",
"language:xac",
"language:xad",
"language:xai",
"language:xaj",
"language:xal",
"language:xam",
"language:xan",
"language:xap",
"language:xaq",
"language:xar",
"language:xas",
"language:xat",
"language:xau",
"language:xav",
"language:xaw",
"language:xbc",
"language:xbe",
"language:xbg",
"language:xbi",
"language:xbm",
"language:xbn",
"language:xbr",
"language:xbw",
"language:xce",
"language:xcg",
"language:xch",
"language:xcl",
"language:xcm",
"language:xcn",
"language:xco",
"language:xcr",
"language:xct",
"language:xcw",
"language:xcy",
"language:xdc",
"language:xdk",
"language:xdm",
"language:xdy",
"language:xeb",
"language:xed",
"language:xeg",
"language:xel",
"language:xem",
"language:xep",
"language:xer",
"language:xes",
"language:xet",
"language:xeu",
"language:xfa",
"language:xga",
"language:xgd",
"language:xgf",
"language:xgm",
"language:xhd",
"language:xho",
"language:xht",
"language:xhu",
"language:xib",
"language:xii",
"language:xil",
"language:xin",
"language:xir",
"language:xiy",
"language:xjt",
"language:xka",
"language:xkb",
"language:xke",
"language:xkg",
"language:xkl",
"language:xkn",
"language:xkq",
"language:xkr",
"language:xks",
"language:xkv",
"language:xky",
"language:xla",
"language:xlb",
"language:xlc",
"language:xld",
"language:xlg",
"language:xlo",
"language:xlp",
"language:xls",
"language:xlu",
"language:xly",
"language:xmb",
"language:xmf",
"language:xmg",
"language:xmh",
"language:xmk",
"language:xmm",
"language:xmn",
"language:xmr",
"language:xmt",
"language:xmu",
"language:xmv",
"language:xmw",
"language:xmy",
"language:xmz",
"language:xnb",
"language:xng",
"language:xni",
"language:xnm",
"language:xnn",
"language:xno",
"language:xns",
"language:xny",
"language:xnz",
"language:xod",
"language:xog",
"language:xok",
"language:xom",
"language:xon",
"language:xoo",
"language:xop",
"language:xor",
"language:xow",
"language:xpb",
"language:xpd",
"language:xpe",
"language:xpf",
"language:xpg",
"language:xpk",
"language:xpm",
"language:xpq",
"language:xpr",
"language:xpt",
"language:xpu",
"language:xpv",
"language:xpy",
"language:xqt",
"language:xra",
"language:xre",
"language:xri",
"language:xrn",
"language:xrt",
"language:xru",
"language:xrw",
"language:xsa",
"language:xsb",
"language:xsc",
"language:xse",
"language:xsh",
"language:xsi",
"language:xsl",
"language:xsm",
"language:xsp",
"language:xsr",
"language:xss",
"language:xsu",
"language:xsv",
"language:xsy",
"language:xta",
"language:xtc",
"language:xtd",
"language:xte",
"language:xtg",
"language:xtm",
"language:xto",
"language:xtv",
"language:xua",
"language:xub",
"language:xug",
"language:xul",
"language:xum",
"language:xuo",
"language:xup",
"language:xur",
"language:xut",
"language:xuu",
"language:xvi",
"language:xvn",
"language:xvs",
"language:xwa",
"language:xwc",
"language:xwd",
"language:xwg",
"language:xwk",
"language:xwo",
"language:xww",
"language:xxk",
"language:xxt",
"language:xyj",
"language:xyk",
"language:xyt",
"language:xyy",
"language:xzh",
"language:xzp",
"language:yaa",
"language:yab",
"language:yad",
"language:yae",
"language:yaf",
"language:yag",
"language:yah",
"language:yai",
"language:yaj",
"language:yak",
"language:yal",
"language:yam",
"language:yan",
"language:yao",
"language:yap",
"language:yaq",
"language:yar",
"language:yas",
"language:yat",
"language:yau",
"language:yav",
"language:yaw",
"language:yay",
"language:yaz",
"language:yba",
"language:ybb",
"language:ybe",
"language:ybh",
"language:ybi",
"language:ybj",
"language:ybl",
"language:ybm",
"language:ybn",
"language:ybo",
"language:ybx",
"language:yby",
"language:ycl",
"language:ycn",
"language:yda",
"language:ydd",
"language:ydg",
"language:ydk",
"language:yea",
"language:yec",
"language:yee",
"language:yer",
"language:yes",
"language:yet",
"language:yev",
"language:yey",
"language:yga",
"language:ygr",
"language:ygw",
"language:yha",
"language:yhl",
"language:yia",
"language:yig",
"language:yih",
"language:yii",
"language:yij",
"language:yim",
"language:yis",
"language:yix",
"language:yka",
"language:ykg",
"language:yki",
"language:ykm",
"language:yko",
"language:ykr",
"language:ykt",
"language:yky",
"language:yle",
"language:ylg",
"language:yli",
"language:yll",
"language:yln",
"language:ylr",
"language:ylu",
"language:yly",
"language:ymb",
"language:ymc",
"language:yme",
"language:yml",
"language:ymm",
"language:ymo",
"language:ymp",
"language:ynd",
"language:ynk",
"language:ynl",
"language:ynn",
"language:ynq",
"language:yns",
"language:ynu",
"language:yob",
"language:yog",
"language:yoi",
"language:yok",
"language:yol",
"language:yom",
"language:yon",
"language:yor",
"language:yot",
"language:yox",
"language:yoy",
"language:ypg",
"language:yph",
"language:ypz",
"language:yra",
"language:yrb",
"language:yre",
"language:yrk",
"language:yrl",
"language:yrn",
"language:yrw",
"language:ysn",
"language:ysr",
"language:yss",
"language:yua",
"language:yub",
"language:yuc",
"language:yue",
"language:yuf",
"language:yug",
"language:yui",
"language:yuj",
"language:yuk",
"language:yul",
"language:yum",
"language:yun",
"language:yup",
"language:yur",
"language:yut",
"language:yuw",
"language:yux",
"language:yuy",
"language:yuz",
"language:yva",
"language:yvt",
"language:ywa",
"language:ywn",
"language:ywq",
"language:ywr",
"language:ywt",
"language:yww",
"language:yxg",
"language:yxl",
"language:yyr",
"language:yyu",
"language:yzg",
"language:zaa",
"language:zab",
"language:zac",
"language:zad",
"language:zae",
"language:zaf",
"language:zag",
"language:zai",
"language:zaj",
"language:zak",
"language:zal",
"language:zao",
"language:zap",
"language:zaq",
"language:zar",
"language:zav",
"language:zaw",
"language:zay",
"language:zaz",
"language:zba",
"language:zbc",
"language:zbe",
"language:zbl",
"language:zbw",
"language:zca",
"language:zch",
"language:zdj",
"language:zea",
"language:zeg",
"language:zen",
"language:zga",
"language:zgb",
"language:zgh",
"language:zgn",
"language:zgr",
"language:zha",
"language:zhb",
"language:zhd",
"language:zhi",
"language:zhn",
"language:zho",
"language:zhw",
"language:zia",
"language:zik",
"language:zim",
"language:zin",
"language:ziw",
"language:ziz",
"language:zka",
"language:zkb",
"language:zkd",
"language:zkg",
"language:zkk",
"language:zko",
"language:zkp",
"language:zkr",
"language:zkt",
"language:zku",
"language:zlj",
"language:zlm",
"language:zln",
"language:zma",
"language:zmb",
"language:zmc",
"language:zmd",
"language:zme",
"language:zmg",
"language:zmj",
"language:zmk",
"language:zml",
"language:zmm",
"language:zmn",
"language:zmo",
"language:zmp",
"language:zmr",
"language:zmt",
"language:zmu",
"language:zmv",
"language:zmw",
"language:zmx",
"language:zmy",
"language:zmz",
"language:zne",
"language:zng",
"language:zns",
"language:zoc",
"language:zoh",
"language:zom",
"language:zoq",
"language:zor",
"language:zos",
"language:zpc",
"language:zpf",
"language:zpi",
"language:zpl",
"language:zpm",
"language:zpn",
"language:zpo",
"language:zpq",
"language:zpr",
"language:zps",
"language:zpu",
"language:zpv",
"language:zpw",
"language:zpx",
"language:zpz",
"language:zqe",
"language:zro",
"language:zrp",
"language:zrs",
"language:zsa",
"language:zsm",
"language:zsu",
"language:ztg",
"language:ztp",
"language:ztq",
"language:zts",
"language:ztu",
"language:zua",
"language:zuh",
"language:zul",
"language:zum",
"language:zun",
"language:zwa",
"language:zyb",
"language:zyg",
"language:zyj",
"language:zyn",
"language:zzj",
"license:cc0-1.0",
"lexicons",
"region:us"
]
"mar", "mas", "mat", "mau", "mav", "maw", "max", "maz", "mbb", "mbc", "mbd", "mbe", "mbf", "mbh", "mbi", "mbj", "mbl", "mbm", "mbn", "mbo", "mbp", "mbq", "mbr", "mbs", "mbt", "mbu", "mbv", "mbw", "mbx", "mbz", "mca", "mcb", "mcc", "mcd", "mcf", "mcg", "mch", "mci", "mcj", "mck", "mcm", "mcn", "mco", "mcp", "mcq", "mcr", "mcs", "mct", "mcu", "mcv", "mcw", "mcx", "mcy", "mcz", "mda", "mdb", "mdc", "mdd", "mde", "mdf", "mdh", "mdi", "mdj", "mdm", "mdp", "mdq", "mdr", "mds", "mdt", "mdv", "mdw", "mdx", "mdy", "mdz", "mea", "meb", "mec", "med", "mee", "mef", "mei", "mej", "mek", "mel", "mem", "men", "meo", "mep", "meq", "mer", "met", "meu", "mev", "mew", "mey", "mez", "mfa", "mfb", "mfc", "mfd", "mfe", "mff", "mfg", "mfh", "mfi", "mfj", "mfl", "mfm", "mfn", "mfo", "mfp", "mfq", "mfr", "mft", "mfu", "mfv", "mfw", "mfx", "mfy", "mfz", "mga", "mgb", "mgc", "mgd", "mge", "mgf", "mgh", "mgi", "mgk", "mgl", "mgm", "mgo", "mgp", "mgq", "mgr", "mgs", "mgu", "mgv", "mgw", "mgy", "mgz", "mha", "mhb", "mhc", "mhd", "mhe", "mhf", "mhg", "mhi", "mhj", "mhk", "mhl", "mhn", "mho", "mhq", "mhr", "mhs", "mht", "mhu", "mhw", "mhx", "mhy", "mhz", "mia", "mib", "mic", "mid", "mie", "mif", "mig", "mih", "mij", "mik", "mil", "mim", "min", "mio", "mip", "miq", "mir", "mit", "miu", "miw", "mix", "mjc", "mjd", "mjg", "mjh", "mji", "mjj", "mjk", "mjm", "mjs", "mjt", "mjw", "mjx", "mjy", "mka", "mkc", "mkd", "mke", "mkf", "mkg", "mki", "mkj", "mkl", "mkm", "mkn", "mkp", "mkq", "mkr", "mks", "mkv", "mkw", "mky", "mkz", "mla", "mlc", "mle", "mlf", "mlg", "mlh", "mlj", "mlk", "mll", "mlm", "mln", "mlp", "mlq", "mlr", "mls", "mlt", "mlu", "mlv", "mlw", "mlx", "mma", "mmb", "mmd", "mme", "mmf", "mmg", "mmh", "mmi", "mmm", "mmn", "mmo", "mmp", "mmq", "mmr", "mmt", "mmu", "mmw", "mmx", "mmy", "mmz", "mna", "mnb", "mnc", "mnd", "mne", "mnf", "mng", "mnh", "mni", "mnj", "mnk", "mnl", "mnm", "mnp", "mnq", "mnr", "mns", "mnv", "mnw", "mnx", "mny", "mnz", "moa", "moc", "mod", "moe", "mog", "moh", "moi", "moj", "mok", "mom", "mop", "moq", "mor", "mos", "mot", "mou", "mov", "mox", "moy", "moz", "mpa", "mpb", "mpc", "mpd", "mpe", "mpg", "mph", "mpi", "mpj", "mpk", "mpl", "mpm", "mpn", "mpp", "mpq", "mpr", "mps", "mpt", "mpu", "mpv", "mpw", "mpx", "mpy", "mpz", "mqa", "mqb", "mqc", "mqe", "mqf", "mqj", "mqk", "mql", "mqm", "mqn", "mqo", "mqp", "mqr", "mqs", "mqu", "mqv", "mqw", "mqx", "mqy", "mqz", "mra", "mrb", "mrc", "mrf", "mrg", "mrh", "mri", "mrj", "mrk", "mrl", "mrm", "mrn", "mro", "mrp", "mrq", "mrr", "mrs", "mrt", "mru", "mrv", "mrw", "mrx", "mry", "mrz", "msa", "msb", "mse", "msf", "msg", "msh", "msj", "msk", "msl", "msm", "msn", "mso", "mss", "msu", "msw", "msx", "msy", "msz", "mta", "mtc", "mtd", "mte", "mtf", "mtg", "mth", "mti", "mtj", "mtk", "mtl", "mtm", "mtn", "mto", "mtp", "mtq", "mtr", "mts", "mtt", "mtu", "mtv", "mty", "mua", "mub", "muc", "mue", "mug", "muh", "mui", "muk", "mul", "muo", "mup", "mur", "mus", "muu", "mux", "muy", "muz", "mva", "mvb", "mvd", "mve", "mvf", "mvi", "mvn", "mvo", "mvp", "mvq", "mvr", "mvt", "mvu", "mvv", "mvx", "mvy", "mvz", "mwa", "mwb", "mwc", "mwe", "mwf", "mwg", "mwh", "mwi", "mwl", "mwm", "mwn", "mwo", "mwp", "mwq", "mwr", "mwt", "mwv", "mww", "mxb", "mxc", "mxd", "mxe", "mxf", "mxg", "mxi", "mxj", "mxk", "mxm", "mxn", "mxp", "mxq", "mxr", "mxt", "mxu", "mxv", "mxw", "mxx", "mxy", "mxz", "mya", "myb", "mye", "myf", "myg", "myh", "myj", "myk", "myl", "mym", "myo", "myp", "myr", "mys", "myu", "myv", "myw", "myx", "myy", "myz", "mzb", "mzd", "mze", "mzh", "mzi", "mzj", "mzk", "mzm", 
"mzn", "mzo", "mzp", "mzq", "mzr", "mzs", "mzt", "mzu", "mzv", "mzw", "mzz", "nab", "nac", "nae", "naf", "nag", "naj", "nak", "nal", "nam", "nap", "naq", "nar", "nas", "nat", "nau", "nav", "naw", "nax", "nay", "naz", "nba", "nbb", "nbc", "nbe", "nbh", "nbi", "nbj", "nbk", "nbl", "nbm", "nbn", "nbo", "nbp", "nbq", "nbr", "nbu", "nbv", "nbw", "nby", "nca", "ncb", "ncc", "ncd", "nce", "ncf", "ncg", "nch", "nci", "ncj", "nck", "ncl", "ncm", "ncn", "nco", "ncr", "nct", "ncu", "ncx", "ncz", "nda", "ndb", "ndc", "ndd", "nde", "ndg", "ndh", "ndi", "ndj", "ndm", "ndn", "ndo", "ndq", "ndr", "nds", "ndt", "ndu", "ndv", "ndx", "ndy", "ndz", "neb", "nec", "nee", "neg", "neh", "nej", "nem", "nen", "neo", "neq", "ner", "net", "neu", "nev", "new", "ney", "nez", "nfa", "nfd", "nfl", "nfr", "nfu", "nga", "ngb", "ngc", "nge", "ngg", "ngh", "ngi", "ngj", "ngk", "ngl", "ngn", "ngp", "ngq", "ngr", "ngs", "ngt", "ngu", "ngv", "ngx", "ngy", "ngz", "nha", "nhb", "nhc", "nhd", "nhe", "nhf", "nhg", "nhi", "nhm", "nhn", "nho", "nhp", "nhq", "nhr", "nht", "nhu", "nhv", "nhw", "nhx", "nhy", "nia", "nib", "nid", "nie", "nif", "nig", "nih", "nii", "nij", "nil", "nim", "nin", "nio", "niq", "nir", "nis", "nit", "niu", "niv", "niw", "niy", "niz", "nja", "njb", "njh", "nji", "njj", "njl", "njm", "njn", "njo", "njr", "nju", "njy", "njz", "nka", "nkb", "nkc", "nkd", "nke", "nkf", "nkg", "nkh", "nki", "nkj", "nkk", "nko", "nkp", "nkr", "nks", "nku", "nkw", "nkx", "nkz", "nla", "nlc", "nld", "nle", "nlg", "nli", "nlj", "nll", "nlo", "nlu", "nlx", "nly", "nlz", "nma", "nmb", "nmc", "nmd", "nme", "nmf", "nmg", "nmh", "nmj", "nmk", "nml", "nmm", "nmn", "nmp", "nms", "nmt", "nmu", "nmv", "nmw", "nmx", "nmy", "nmz", "nna", "nnb", "nnc", "nnd", "nnf", "nng", "nnh", "nnj", "nnk", "nnl", "nnm", "nno", "nnp", "nnq", "nnr", "nnt", "nnu", "nnv", "nnw", "nny", "nnz", "noa", "nob", "noc", "nod", "noe", "nog", "noh", "noj", "nok", "nol", "nom", "non", "nop", "noq", "nor", "not", "nou", "nov", "now", "noz", "nph", "npi", "npl", "npn", "npo", "nps", "npy", "nqg", "nqk", "nqm", "nqo", "nra", "nrb", "nre", "nrf", "nrg", "nri", "nrk", "nrl", "nrm", "nrn", "nrt", "nru", "nrz", "nsa", "nsb", "nse", "nsg", "nsk", "nsm", "nsn", "nso", "nsq", "nss", "nst", "nsu", "nsw", "nsx", "nsy", "nsz", "nti", "ntj", "ntk", "ntm", "nto", "ntp", "ntu", "ntw", "nty", "ntz", "nua", "nuc", "nud", "nuf", "nug", "nui", "nuj", "nuk", "nul", "num", "nun", "nuo", "nup", "nur", "nus", "nut", "nuw", "nux", "nuy", "nuz", "nvh", "nvo", "nwa", "nwb", "nwe", "nwi", "nwr", "nwy", "nxa", "nxe", "nxg", "nxi", "nxl", "nxn", "nxq", "nxr", "nxx", "nya", "nyb", "nyd", "nye", "nyf", "nyh", "nyi", "nyj", "nyk", "nyl", "nym", "nyn", "nyo", "nyp", "nyq", "nys", "nyt", "nyu", "nyv", "nyx", "nyy", "nza", "nzb", "nzd", "nzi", "nzk", "nzm", "nzy", "nzz", "oaa", "oac", "oar", "obi", "obk", "obm", "obo", "obr", "obt", "oca", "och", "oci", "oco", "ocu", "odt", "odu", "ofo", "ofs", "ofu", "ogb", "ogc", "oge", "ogo", "ogu", "oht", "oia", "ojb", "ojc", "ojg", "oji", "ojp", "ojs", "ojv", "ojw", "oka", "okb", "okd", "oke", "oki", "okj", "okn", "oko", "okr", "oks", "oku", "okv", "ola", "old", "ole", "olm", "olo", "olt", "oma", "omb", "omc", "omg", "omk", "oml", "omo", "omr", "omt", "omu", "omw", "omx", "ona", "onb", "one", "ong", "oni", "onj", "onn", "ono", "onp", "ons", "onu", "onw", "ood", "oog", "oon", "opa", "opk", "opm", "opo", "opt", "opy", "ora", "orc", "ore", "org", "orh", "ori", "orm", "oro", "orr", "ort", "oru", "orv", "orw", "orx", "ory", "orz", "osa", "osc", "osi", "osp", "oss", "osx", 
"ota", "otd", "ote", "oti", "otk", "otl", "otm", "otq", "otr", "ots", "ott", "otu", "otw", "otz", "oua", "oub", "oum", "ovd", "owi", "owl", "oyb", "oyd", "oym", "ozm", "pab", "pac", "pad", "pae", "paf", "pag", "pah", "pai", "pak", "pal", "pam", "pan", "pao", "pap", "paq", "par", "pas", "pau", "pav", "paw", "pay", "paz", "pbb", "pbe", "pbf", "pbg", "pbh", "pbi", "pbn", "pbo", "pbp", "pbr", "pbs", "pbt", "pbu", "pbv", "pby", "pca", "pcb", "pcc", "pcd", "pce", "pci", "pcj", "pck", "pcm", "pcn", "pcp", "pda", "pdc", "pdo", "pdt", "pdu", "peb", "ped", "pee", "pef", "peg", "peh", "pei", "pej", "pek", "pem", "peo", "pep", "peq", "pes", "pex", "pez", "pfe", "pfl", "pga", "pgd", "pgk", "pgn", "pgu", "pha", "phg", "phk", "phl", "phn", "pho", "pht", "pia", "pib", "pic", "pid", "pie", "pif", "pih", "pij", "pil", "pim", "pin", "pio", "pip", "pir", "pis", "pit", "piu", "piv", "piw", "pix", "piz", "pjt", "pka", "pkb", "pkc", "pkg", "pkh", "pkn", "pko", "pkp", "pkt", "pku", "pla", "plb", "plc", "ple", "plg", "plh", "pli", "plj", "plk", "pll", "pln", "plo", "plq", "plr", "pls", "plt", "plu", "plv", "plw", "ply", "plz", "pma", "pmb", "pmf", "pmh", "pmi", "pmk", "pml", "pmm", "pmn", "pmo", "pmq", "pmr", "pms", "pmt", "pmw", "pmy", "pmz", "pna", "pnb", "pne", "png", "pnh", "pni", "pnj", "pnn", "pno", "pnp", "pnq", "pnr", "pns", "pnt", "pnu", "pnv", "pnw", "pny", "poc", "poe", "pof", "pog", "poh", "poi", "pok", "pol", "pom", "pon", "poo", "poq", "por", "pos", "pot", "pov", "pow", "pox", "poy", "ppe", "ppi", "ppk", "ppl", "ppm", "ppn", "ppo", "ppq", "ppt", "ppu", "pqa", "pqm", "prc", "prd", "pre", "prf", "prg", "prh", "pri", "prk", "prm", "prn", "pro", "prq", "prr", "prs", "pru", "prw", "prx", "psa", "pse", "psh", "psi", "psm", "psn", "psq", "pss", "pst", "psu", "psw", "pta", "pth", "pti", "ptn", "pto", "ptp", "ptr", "ptu", "ptv", "pua", "pub", "puc", "pue", "pui", "puj", "pum", "puo", "pup", "puq", "pur", "puu", "puw", "puy", "pwa", "pwg", "pwi", "pwm", "pwn", "pwo", "pym", "pyn", "pyu", "pyy", "qua", "qub", "quc", "que", "quf", "qug", "quh", "qui", "quk", "qul", "qum", "qun", "qup", "qus", "quv", "quw", "qux", "quy", "quz", "qva", "qvc", "qve", "qvi", "qvl", "qvn", "qvo", "qvs", "qvw", "qvy", "qvz", "qwa", "qwh", "qwm", "qwt", "qxa", "qxn", "qxo", "qxp", "qxq", "qxr", "qxs", "qxu", "qxw", "qya", "qyp", "raa", "rab", "rac", "rad", "raf", "rag", "rah", "rai", "raj", "rak", "ral", "ram", "ran", "rao", "rap", "rar", "ras", "rau", "rav", "raw", "ray", "raz", "rbb", "rbp", "rcf", "rea", "reb", "ree", "reg", "rej", "rel", "rem", "ren", "res", "rey", "rga", "rgk", "rgn", "rgr", "rgs", "rhg", "rhp", "rif", "ril", "rim", "rin", "rir", "rit", "riu", "rji", "rjs", "rkb", "rkh", "rki", "rkm", "rma", "rmb", "rmc", "rme", "rmf", "rmg", "rmh", "rmi", "rmk", "rml", "rmm", "rmn", "rmo", "rmp", "rmq", "rmt", "rmu", "rmv", "rmw", "rmy", "rnd", "rng", "rnn", "rnp", "rnr", "rnw", "rob", "rod", "roe", "rof", "rog", "roh", "rol", "rom", "ron", "roo", "rop", "rou", "rpn", "rpt", "rri", "rro", "rsb", "rth", "rtm", "rtw", "rub", "rue", "ruf", "rug", "rui", "ruk", "run", "ruo", "rup", "ruq", "rus", "rut", "ruu", "rwa", "rwk", "rwm", "rwo", "rwr", "ryn", "rys", "ryu", "sab", "sac", "sad", "sae", "saf", "sag", "sah", "saj", "sak", "sam", "san", "sao", "saq", "sar", "sas", "sat", "sau", "sav", "saw", "sax", "say", "saz", "sba", "sbb", "sbc", "sbd", "sbe", "sbf", "sbg", "sbh", "sbi", "sbk", "sbl", "sbm", "sbo", "sbp", "sbq", "sbr", "sbs", "sbt", "sbu", "sbv", "sbw", "scb", "sce", "scg", "sci", "sck", "scl", "scn", "sco", "scp", "scq", 
"scs", "scu", "scv", "scw", "sda", "sdc", "sde", "sdg", "sdh", "sdj", "sdn", "sdo", "sdp", "sdq", "sds", "sdt", "sdz", "sea", "sec", "sed", "see", "sef", "seh", "sei", "sej", "sek", "sel", "sen", "seo", "seq", "ser", "ses", "set", "seu", "sev", "sew", "sey", "sga", "sgc", "sgd", "sge", "sgh", "sgi", "sgm", "sgp", "sgr", "sgs", "sgt", "sgw", "sgy", "sgz", "sha", "shb", "she", "shg", "shh", "shi", "shj", "shk", "shl", "shn", "sho", "shp", "shq", "shr", "shs", "sht", "shu", "shv", "shw", "shx", "shy", "shz", "sia", "sib", "sid", "sie", "sif", "sig", "sii", "sij", "sik", "sil", "sim", "sin", "sip", "siq", "sir", "sis", "siu", "siw", "six", "siy", "siz", "sja", "sjd", "sje", "sjg", "sjk", "sjl", "sjm", "sjn", "sjo", "sjr", "sjs", "sjt", "sju", "sjw", "ska", "skb", "skc", "skd", "ske", "skf", "skg", "skh", "ski", "skj", "skm", "skr", "sks", "skt", "sku", "skv", "skw", "skx", "sky", "skz", "slc", "slg", "slh", "sli", "slj", "slk", "sll", "slm", "sln", "slp", "slr", "slu", "slv", "slw", "sly", "slz", "sma", "smb", "smc", "sme", "smg", "smj", "smk", "sml", "smn", "smo", "smp", "smq", "smr", "sms", "smw", "smx", "smy", "sna", "snc", "snd", "sne", "snf", "sng", "sni", "snk", "snl", "snn", "sno", "snp", "snq", "snr", "sns", "snu", "snv", "snw", "snx", "sny", "snz", "sob", "soc", "sod", "soe", "sog", "soh", "sok", "sol", "som", "sop", "soq", "sor", "sos", "sot", "sou", "sov", "sow", "soz", "spa", "spb", "spc", "spd", "spe", "spi", "spl", "spm", "spn", "spo", "spp", "spr", "sps", "spu", "spx", "spy", "sqa", "sqi", "sqn", "sqo", "sqq", "sqr", "sqt", "squ", "sra", "srb", "src", "srd", "srf", "srh", "sri", "srk", "srl", "srm", "srn", "sro", "srp", "srq", "srr", "srs", "sru", "srv", "srw", "sry", "srz", "ssb", "ssc", "ssd", "sse", "ssf", "ssg", "ssj", "ssk", "ssl", "ssm", "sso", "ssp", "ssq", "sss", "sst", "ssu", "ssv", "ssw", "ssx", "ssy", "ssz", "ste", "stf", "stg", "sth", "sti", "stj", "stk", "stl", "stn", "sto", "stp", "stq", "str", "sts", "stu", "stv", "stw", "sua", "sub", "suc", "sue", "sug", "sui", "suj", "suk", "sun", "suq", "sur", "sus", "sut", "suv", "suw", "sux", "suy", "suz", "sva", "svb", "svc", "sve", "svs", "swb", "swc", "swe", "swf", "swg", "swh", "swi", "swj", "swm", "swn", "swo", "swp", "swq", "sws", "swt", "swu", "swv", "sww", "swx", "swy", "sxb", "sxe", "sxg", "sxm", "sxn", "sxr", "sxu", "sya", "syb", "syc", "syi", "syk", "syl", "sym", "syn", "syr", "sys", "syw", "sza", "szb", "szc", "szd", "sze", "szg", "szl", "szn", "szp", "szv", "szw", "taa", "tab", "tac", "tad", "tae", "taf", "tag", "tah", "taj", "tal", "tam", "tan", "tao", "tap", "taq", "tar", "tas", "tat", "tau", "tav", "tay", "taz", "tba", "tbc", "tbd", "tbe", "tbf", "tbg", "tbh", "tbi", "tbj", "tbk", "tbl", "tbm", "tbn", "tbo", "tbp", "tbr", "tbs", "tbt", "tbu", "tbv", "tbw", "tbx", "tby", "tbz", "tca", "tcb", "tcc", "tcd", "tce", "tcf", "tcg", "tci", "tck", "tcl", "tcm", "tcn", "tcq", "tcs", "tct", "tcx", "tcy", "tcz", "tda", "tdc", "tdd", "tde", "tdf", "tdg", "tdh", "tdi", "tdj", "tdk", "tdl", "tdm", "tdn", "tdo", "tdr", "tds", "tdt", "tdv", "tdx", "tdy", "tea", "tec", "ted", "tee", "teg", "teh", "tei", "tek", "tel", "tem", "ten", "teo", "tep", "teq", "ter", "tes", "tet", "teu", "tev", "tew", "tex", "tey", "tfn", "tfr", "tft", "tgb", "tgc", "tgd", "tgh", "tgi", "tgj", "tgk", "tgl", "tgn", "tgo", "tgp", "tgq", "tgr", "tgs", "tgt", "tgu", "tgw", "tgx", "tgy", "tha", "thd", "the", "thf", "thk", "thl", "thm", "thp", "thq", "thr", "ths", "tht", "thv", "thz", "tia", "tic", "tif", "tig", "tih", "tii", "tij", "tik", "til", "tim", 
"tin", "tio", "tip", "tiq", "tir", "tis", "tit", "tiu", "tiv", "tiw", "tix", "tiy", "tiz", "tjg", "tji", "tjm", "tjs", "tju", "tjw", "tkd", "tke", "tkg", "tkl", "tkm", "tkn", "tkp", "tkq", "tkr", "tks", "tku", "tkw", "tlb", "tlc", "tld", "tlf", "tlg", "tlh", "tli", "tlj", "tlk", "tll", "tlm", "tlo", "tlp", "tlq", "tlr", "tls", "tlt", "tlu", "tlv", "tlx", "tly", "tma", "tmb", "tmc", "tmd", "tmf", "tmi", "tmj", "tmm", "tmn", "tmo", "tmq", "tmr", "tms", "tmt", "tmu", "tmw", "tmy", "tmz", "tna", "tnc", "tnh", "tni", "tnk", "tnl", "tnm", "tnn", "tno", "tnp", "tnq", "tnr", "tnt", "tnw", "tnx", "tnz", "tob", "toc", "tof", "tog", "toh", "toi", "toj", "tol", "tom", "ton", "too", "top", "tos", "tow", "tox", "tpa", "tpc", "tpe", "tpf", "tpg", "tpi", "tpj", "tpl", "tpm", "tpn", "tpp", "tpr", "tpt", "tpu", "tpw", "tpx", "tpy", "tpz", "tqb", "tql", "tqo", "tqp", "tqq", "tqu", "tqw", "tra", "trc", "trd", "trf", "trg", "trh", "tri", "trm", "trn", "tro", "trp", "trq", "trr", "trs", "trt", "tru", "trv", "trw", "trx", "trz", "tsa", "tsb", "tsc", "tsd", "tsg", "tsh", "tsi", "tsj", "tsk", "tsl", "tsn", "tso", "tsr", "tsu", "tsv", "tsw", "tsx", "tsz", "tta", "ttc", "ttd", "tte", "ttf", "ttg", "tti", "ttj", "ttk", "ttm", "ttn", "ttq", "ttr", "tts", "ttt", "ttu", "ttv", "ttw", "tty", "tua", "tub", "tuc", "tud", "tue", "tuf", "tug", "tuh", "tui", "tuk", "tul", "tum", "tun", "tuo", "tuq", "tur", "tus", "tuu", "tuv", "tux", "tuy", "tuz", "tva", "tvd", "tve", "tvk", "tvl", "tvm", "tvo", "tvs", "tvu", "tvw", "tvy", "twa", "twb", "twd", "twe", "twf", "twg", "twh", "twi", "twm", "twp", "twq", "twt", "twu", "tww", "twx", "twy", "txa", "txb", "txc", "txe", "txg", "txh", "txi", "txm", "txn", "txo", "txs", "txt", "txu", "txx", "txy", "tya", "tye", "tyi", "tyj", "tyn", "typ", "tyr", "tyu", "tyv", "tyx", "tyz", "tzh", "tzj", "tzl", "tzm", "tzn", "tzo", "tzx", "uam", "uan", "uar", "uba", "ubi", "ubl", "ubr", "ubu", "uby", "uda", "ude", "udi", "udj", "udl", "udm", "udu", "ues", "ufi", "uga", "uge", "ugo", "uha", "uhn", "uig", "uiv", "uji", "uka", "ukg", "ukh", "uki", "ukp", "ukq", "ukr", "uku", "ukv", "ula", "ulc", "ule", "ulf", "uli", "ulk", "ulm", "uln", "ulu", "ulw", "uma", "umb", "umc", "umg", "umm", "umo", "ump", "umr", "ums", "umu", "una", "und", "une", "ung", "unk", "unm", "unn", "unr", "unz", "upv", "ura", "urb", "urc", "urd", "ure", "urf", "urg", "urh", "uri", "urk", "urn", "urp", "urr", "urt", "uru", "urv", "urw", "urx", "ury", "urz", "usa", "ush", "usk", "usp", "usu", "uta", "ute", "utp", "utr", "utu", "uum", "uur", "uuu", "uve", "uvh", "uvl", "uwa", "uya", "uzn", "uzs", "vaa", "vae", "vaf", "vag", "vah", "vai", "vaj", "val", "vam", "van", "vao", "var", "vas", "vau", "vay", "vbb", "vbk", "vec", "ved", "vel", "vem", "ven", "veo", "vep", "ver", "vic", "vie", "vif", "vig", "vil", "vin", "vit", "viv", "vka", "vkj", "vkl", "vkm", "vko", "vkp", "vku", "vlp", "vls", "vma", "vmb", "vme", "vmf", "vmg", "vmi", "vmk", "vml", "vmm", "vmp", "vmr", "vmw", "vmy", "vmz", "vnk", "vnm", "vnp", "vol", "vor", "vot", "vra", "vro", "vrs", "vrt", "vun", "vut", "waa", "wab", "wac", "wad", "wae", "wah", "waj", "wal", "wam", "wan", "wao", "wap", "waq", "war", "was", "wat", "wau", "wav", "waw", "wax", "way", "waz", "wba", "wbb", "wbe", "wbf", "wbh", "wbi", "wbj", "wbk", "wbl", "wbm", "wbp", "wbr", "wbt", "wbv", "wbw", "wca", "wci", "wdd", "wdg", "wdj", "wea", "wed", "weg", "weh", "wei", "weo", "wep", "wer", "wes", "wet", "wew", "wfg", "wga", "wgb", "wgg", "wgi", "wgo", "wgu", "wgy", "wha", "whk", "wib", "wic", "wig", "wih", "wii", "wik", 
"wil", "wim", "win", "wir", "wiu", "wiv", "wiy", "wja", "wji", "wka", "wkw", "wla", "wlc", "wle", "wlg", "wlh", "wli", "wlk", "wll", "wlm", "wln", "wlo", "wlr", "wls", "wlu", "wlv", "wlw", "wly", "wmb", "wmc", "wmd", "wme", "wmh", "wmi", "wmo", "wms", "wmt", "wmw", "wmx", "wnb", "wnc", "wnd", "wne", "wni", "wnk", "wno", "wnp", "wnu", "wnw", "wny", "woa", "wob", "woc", "wod", "woe", "wof", "wog", "woi", "wok", "wol", "wom", "won", "wos", "wow", "wpc", "wrb", "wrg", "wrh", "wri", "wrk", "wrl", "wrm", "wrn", "wro", "wrp", "wrr", "wrs", "wru", "wrv", "wrw", "wry", "wrz", "wsa", "wsg", "wsi", "wsk", "wsr", "wsv", "wtf", "wth", "wti", "wtm", "wtw", "wub", "wuh", "wul", "wum", "wun", "wut", "wuu", "wuv", "wux", "wuy", "wwa", "wwo", "wwr", "www", "wxa", "wyb", "wyi", "wym", "wyr", "wyy", "xaa", "xab", "xac", "xad", "xai", "xaj", "xal", "xam", "xan", "xap", "xaq", "xar", "xas", "xat", "xau", "xav", "xaw", "xbc", "xbe", "xbg", "xbi", "xbm", "xbn", "xbr", "xbw", "xce", "xcg", "xch", "xcl", "xcm", "xcn", "xco", "xcr", "xct", "xcw", "xcy", "xdc", "xdk", "xdm", "xdy", "xeb", "xed", "xeg", "xel", "xem", "xep", "xer", "xes", "xet", "xeu", "xfa", "xga", "xgd", "xgf", "xgm", "xhd", "xho", "xht", "xhu", "xib", "xii", "xil", "xin", "xir", "xiy", "xjt", "xka", "xkb", "xke", "xkg", "xkl", "xkn", "xkq", "xkr", "xks", "xkv", "xky", "xla", "xlb", "xlc", "xld", "xlg", "xlo", "xlp", "xls", "xlu", "xly", "xmb", "xmf", "xmg", "xmh", "xmk", "xmm", "xmn", "xmr", "xmt", "xmu", "xmv", "xmw", "xmy", "xmz", "xnb", "xng", "xni", "xnm", "xnn", "xno", "xns", "xny", "xnz", "xod", "xog", "xok", "xom", "xon", "xoo", "xop", "xor", "xow", "xpb", "xpd", "xpe", "xpf", "xpg", "xpk", "xpm", "xpq", "xpr", "xpt", "xpu", "xpv", "xpy", "xqt", "xra", "xre", "xri", "xrn", "xrt", "xru", "xrw", "xsa", "xsb", "xsc", "xse", "xsh", "xsi", "xsl", "xsm", "xsp", "xsr", "xss", "xsu", "xsv", "xsy", "xta", "xtc", "xtd", "xte", "xtg", "xtm", "xto", "xtv", "xua", "xub", "xug", "xul", "xum", "xuo", "xup", "xur", "xut", "xuu", "xvi", "xvn", "xvs", "xwa", "xwc", "xwd", "xwg", "xwk", "xwo", "xww", "xxk", "xxt", "xyj", "xyk", "xyt", "xyy", "xzh", "xzp", "yaa", "yab", "yad", "yae", "yaf", "yag", "yah", "yai", "yaj", "yak", "yal", "yam", "yan", "yao", "yap", "yaq", "yar", "yas", "yat", "yau", "yav", "yaw", "yay", "yaz", "yba", "ybb", "ybe", "ybh", "ybi", "ybj", "ybl", "ybm", "ybn", "ybo", "ybx", "yby", "ycl", "ycn", "yda", "ydd", "ydg", "ydk", "yea", "yec", "yee", "yer", "yes", "yet", "yev", "yey", "yga", "ygr", "ygw", "yha", "yhl", "yia", "yig", "yih", "yii", "yij", "yim", "yis", "yix", "yka", "ykg", "yki", "ykm", "yko", "ykr", "ykt", "yky", "yle", "ylg", "yli", "yll", "yln", "ylr", "ylu", "yly", "ymb", "ymc", "yme", "yml", "ymm", "ymo", "ymp", "ynd", "ynk", "ynl", "ynn", "ynq", "yns", "ynu", "yob", "yog", "yoi", "yok", "yol", "yom", "yon", "yor", "yot", "yox", "yoy", "ypg", "yph", "ypz", "yra", "yrb", "yre", "yrk", "yrl", "yrn", "yrw", "ysn", "ysr", "yss", "yua", "yub", "yuc", "yue", "yuf", "yug", "yui", "yuj", "yuk", "yul", "yum", "yun", "yup", "yur", "yut", "yuw", "yux", "yuy", "yuz", "yva", "yvt", "ywa", "ywn", "ywq", "ywr", "ywt", "yww", "yxg", "yxl", "yyr", "yyu", "yzg", "zaa", "zab", "zac", "zad", "zae", "zaf", "zag", "zai", "zaj", "zak", "zal", "zao", "zap", "zaq", "zar", "zav", "zaw", "zay", "zaz", "zba", "zbc", "zbe", "zbl", "zbw", "zca", "zch", "zdj", "zea", "zeg", "zen", "zga", "zgb", "zgh", "zgn", "zgr", "zha", "zhb", "zhd", "zhi", "zhn", "zho", "zhw", "zia", "zik", "zim", "zin", "ziw", "ziz", "zka", "zkb", "zkd", "zkg", "zkk", "zko", "zkp", 
"zkr", "zkt", "zku", "zlj", "zlm", "zln", "zma", "zmb", "zmc", "zmd", "zme", "zmg", "zmj", "zmk", "zml", "zmm", "zmn", "zmo", "zmp", "zmr", "zmt", "zmu", "zmv", "zmw", "zmx", "zmy", "zmz", "zne", "zng", "zns", "zoc", "zoh", "zom", "zoq", "zor", "zos", "zpc", "zpf", "zpi", "zpl", "zpm", "zpn", "zpo", "zpq", "zpr", "zps", "zpu", "zpv", "zpw", "zpx", "zpz", "zqe", "zro", "zrp", "zrs", "zsa", "zsm", "zsu", "ztg", "ztp", "ztq", "zts", "ztu", "zua", "zuh", "zul", "zum", "zun", "zwa", "zyb", "zyg", "zyj", "zyn", "zzj"], "license": "cc0-1.0", "size_categories": ["10M<n<100M"], "configs": [{"config_name": "panlex", "data_files": "panlex.csv", "sep": ";"}], "tags": ["lexicons"]} | 2024-02-03T17:36:12+00:00 | [] | [
"aaa",
"aab",
"aac",
"aad",
"aae",
"aag",
"aai",
"aak",
"aal",
"aan",
"aao",
"aap",
"aaq",
"aar",
"aas",
"aat",
"aau",
"aaw",
"aax",
"aba",
"abb",
"abe",
"abf",
"abg",
"abi",
"abj",
"abk",
"abl",
"abm",
"abn",
"abo",
"abq",
"abr",
"abs",
"abt",
"abu",
"abv",
"abw",
"abx",
"aby",
"abz",
"aca",
"acd",
"ace",
"acf",
"ach",
"aci",
"ack",
"acl",
"acm",
"acn",
"acp",
"acq",
"acr",
"acs",
"act",
"acu",
"acv",
"acw",
"acx",
"acy",
"ada",
"add",
"ade",
"adg",
"adh",
"adi",
"adj",
"adl",
"adn",
"ado",
"adq",
"adr",
"adt",
"adw",
"adx",
"ady",
"adz",
"aeb",
"aec",
"aee",
"ael",
"aem",
"aer",
"aes",
"aew",
"aey",
"aez",
"afb",
"afd",
"afe",
"afi",
"afk",
"afn",
"afo",
"afp",
"afr",
"aft",
"afu",
"afz",
"agb",
"agc",
"agd",
"age",
"agf",
"agg",
"agh",
"agj",
"agl",
"agm",
"agn",
"ago",
"agq",
"agr",
"ags",
"agt",
"agu",
"agv",
"agw",
"agx",
"aha",
"ahb",
"ahg",
"ahh",
"ahi",
"ahk",
"ahl",
"aho",
"ahr",
"ahs",
"aht",
"aia",
"aib",
"aid",
"aie",
"aif",
"aih",
"aii",
"aik",
"ail",
"aim",
"ain",
"aio",
"air",
"ait",
"aiw",
"aix",
"aja",
"ajg",
"aji",
"ajp",
"ajz",
"aka",
"akb",
"akc",
"akd",
"ake",
"akf",
"akg",
"akh",
"aki",
"akj",
"akk",
"akl",
"akm",
"ako",
"akp",
"akq",
"akr",
"aks",
"aku",
"akv",
"akw",
"akx",
"aky",
"akz",
"ala",
"alc",
"ald",
"ale",
"alf",
"alh",
"ali",
"alj",
"alk",
"all",
"alm",
"aln",
"alo",
"alp",
"alq",
"alr",
"als",
"alt",
"alu",
"alw",
"alx",
"aly",
"alz",
"ama",
"amc",
"ame",
"amf",
"amg",
"amh",
"ami",
"amj",
"amk",
"aml",
"amm",
"amn",
"amo",
"amp",
"amq",
"amr",
"ams",
"amt",
"amu",
"amv",
"amw",
"amx",
"amy",
"amz",
"ana",
"anb",
"anc",
"and",
"ane",
"anf",
"ang",
"anh",
"ani",
"anj",
"ank",
"anm",
"ann",
"ano",
"anp",
"anq",
"ant",
"anu",
"anv",
"anw",
"anx",
"any",
"anz",
"aoa",
"aoc",
"aod",
"aof",
"aog",
"aoi",
"aoj",
"aol",
"aom",
"aon",
"aor",
"aos",
"aot",
"aou",
"aox",
"aoz",
"apb",
"apc",
"apd",
"ape",
"aph",
"api",
"apj",
"apk",
"apl",
"apm",
"apn",
"apo",
"app",
"apq",
"apr",
"apt",
"apu",
"apw",
"apx",
"apy",
"apz",
"aqc",
"aqd",
"aqg",
"aqm",
"aqp",
"aqt",
"aqz",
"ara",
"arb",
"arc",
"ard",
"are",
"arg",
"arh",
"ari",
"ark",
"arl",
"arn",
"aro",
"arp",
"arq",
"arr",
"ars",
"arv",
"arw",
"arx",
"ary",
"arz",
"asa",
"asb",
"asc",
"ase",
"asg",
"ash",
"asi",
"asj",
"ask",
"asl",
"asm",
"asn",
"aso",
"asr",
"ass",
"ast",
"asu",
"asv",
"asx",
"asy",
"asz",
"ata",
"atb",
"atc",
"atd",
"ate",
"ati",
"atj",
"atk",
"atm",
"ato",
"atq",
"atr",
"ats",
"att",
"atv",
"atw",
"atx",
"aty",
"atz",
"aua",
"auc",
"aud",
"auh",
"aui",
"auj",
"auk",
"aul",
"aum",
"aun",
"aup",
"aut",
"auu",
"auw",
"auy",
"auz",
"ava",
"avb",
"avd",
"ave",
"avi",
"avk",
"avm",
"avn",
"avt",
"avu",
"avv",
"awa",
"awb",
"awc",
"awe",
"awg",
"awh",
"awi",
"awk",
"awm",
"awn",
"awo",
"awr",
"aws",
"awt",
"awv",
"awx",
"awy",
"axb",
"axe",
"axg",
"axk",
"axl",
"axm",
"axx",
"aya",
"ayb",
"ayc",
"ayd",
"aye",
"ayg",
"ayh",
"ayi",
"ayl",
"aym",
"ayn",
"ayo",
"ayr",
"ayu",
"ayz",
"aza",
"azb",
"azd",
"azg",
"azj",
"azm",
"azn",
"azo",
"azz",
"baa",
"bab",
"bae",
"baf",
"bag",
"baj",
"bak",
"bal",
"bam",
"ban",
"bao",
"bap",
"bar",
"bas",
"bau",
"bav",
"baw",
"bax",
"bba",
"bbb",
"bbc",
"bbd",
"bbe",
"bbf",
"bbh",
"bbi",
"bbj",
"bbk",
"bbl",
"bbn",
"bbp",
"bbq",
"bbr",
"bbs",
"bbt",
"bbu",
"bbv",
"bbw",
"bby",
"bca",
"bcc",
"bcd",
"bce",
"bcf",
"bcg",
"bch",
"bci",
"bcj",
"bck",
"bcl",
"bcm",
"bcn",
"bco",
"bcp",
"bcq",
"bcr",
"bcs",
"bcu",
"bcv",
"bcw",
"bcy",
"bcz",
"bda",
"bdb",
"bdc",
"bdd",
"bde",
"bdg",
"bdh",
"bdi",
"bdj",
"bdk",
"bdl",
"bdm",
"bdn",
"bdp",
"bdq",
"bdr",
"bds",
"bdu",
"bdw",
"bdx",
"bdy",
"bea",
"beb",
"bec",
"bed",
"bee",
"bef",
"beg",
"beh",
"bei",
"bej",
"bek",
"bel",
"bem",
"ben",
"beo",
"bep",
"beq",
"bet",
"beu",
"bev",
"bew",
"bex",
"bey",
"bez",
"bfa",
"bfc",
"bfd",
"bff",
"bfg",
"bfh",
"bfj",
"bfl",
"bfm",
"bfn",
"bfp",
"bfq",
"bfs",
"bft",
"bfu",
"bfw",
"bfy",
"bfz",
"bgc",
"bgd",
"bgf",
"bgg",
"bgi",
"bgj",
"bgk",
"bgl",
"bgn",
"bgo",
"bgp",
"bgq",
"bgr",
"bgs",
"bgt",
"bgu",
"bgv",
"bgz",
"bhb",
"bhc",
"bhd",
"bhf",
"bhg",
"bhh",
"bhj",
"bhl",
"bhn",
"bho",
"bhp",
"bhq",
"bhr",
"bht",
"bhw",
"bhy",
"bhz",
"bia",
"bib",
"bid",
"bie",
"bif",
"big",
"bil",
"bim",
"bin",
"bio",
"biq",
"bir",
"bis",
"bit",
"biu",
"biv",
"biw",
"biy",
"biz",
"bja",
"bjb",
"bjc",
"bje",
"bjf",
"bjg",
"bjh",
"bji",
"bjk",
"bjl",
"bjm",
"bjn",
"bjp",
"bjr",
"bjs",
"bjt",
"bjv",
"bjw",
"bjy",
"bjz",
"bkc",
"bkd",
"bkh",
"bki",
"bkj",
"bkk",
"bkl",
"bkm",
"bkn",
"bko",
"bkq",
"bkr",
"bks",
"bku",
"bkv",
"bkw",
"bkx",
"bky",
"bkz",
"bla",
"blb",
"blc",
"bld",
"ble",
"blf",
"blh",
"bli",
"blj",
"blk",
"bll",
"blm",
"bln",
"blp",
"blq",
"blr",
"bls",
"blt",
"blv",
"blw",
"blx",
"bly",
"blz",
"bmb",
"bmc",
"bmd",
"bme",
"bmf",
"bmg",
"bmh",
"bmi",
"bmk",
"bmm",
"bmn",
"bmo",
"bmr",
"bmt",
"bmu",
"bmv",
"bmx",
"bmz",
"bna",
"bnb",
"bnc",
"bnd",
"bne",
"bnf",
"bng",
"bni",
"bnj",
"bnk",
"bnl",
"bnm",
"bnn",
"bno",
"bnp",
"bnq",
"bnr",
"bns",
"bnv",
"bnw",
"bnx",
"bny",
"bnz",
"boa",
"bob",
"bod",
"boe",
"bof",
"boh",
"boi",
"boj",
"bol",
"bom",
"bon",
"bop",
"boq",
"bor",
"bos",
"bot",
"bou",
"bov",
"box",
"boy",
"boz",
"bpa",
"bpg",
"bph",
"bpi",
"bpj",
"bpk",
"bpm",
"bpn",
"bpp",
"bpq",
"bpr",
"bps",
"bpu",
"bpv",
"bpw",
"bpy",
"bpz",
"bqc",
"bqg",
"bqi",
"bql",
"bqp",
"bqq",
"bqr",
"bqs",
"bqt",
"bqv",
"bqw",
"bqz",
"brb",
"brc",
"brd",
"bre",
"brf",
"brg",
"brh",
"bri",
"brj",
"brk",
"brl",
"brm",
"brn",
"brp",
"brq",
"brr",
"brt",
"bru",
"brv",
"brx",
"brz",
"bsb",
"bsc",
"bse",
"bsf",
"bsg",
"bsh",
"bsi",
"bsk",
"bsl",
"bsm",
"bsn",
"bsp",
"bsq",
"bss",
"bst",
"bsu",
"bsw",
"bsx",
"bsy",
"bta",
"btc",
"btd",
"bte",
"btf",
"bth",
"btj",
"btm",
"btn",
"bto",
"btp",
"btq",
"btr",
"bts",
"btt",
"btu",
"btv",
"btw",
"btx",
"bty",
"btz",
"bua",
"bub",
"buc",
"bud",
"bue",
"buf",
"bug",
"buh",
"buk",
"bul",
"bum",
"bun",
"buo",
"bup",
"buq",
"bus",
"but",
"buw",
"bux",
"buy",
"bva",
"bvb",
"bvc",
"bvd",
"bvg",
"bvi",
"bvj",
"bvk",
"bvm",
"bvn",
"bvo",
"bvq",
"bvr",
"bvv",
"bvw",
"bvx",
"bvz",
"bwa",
"bwb",
"bwd",
"bwe",
"bwf",
"bwg",
"bwi",
"bwk",
"bwl",
"bwm",
"bwn",
"bwo",
"bwp",
"bwq",
"bwr",
"bws",
"bwt",
"bwu",
"bww",
"bwx",
"bxa",
"bxb",
"bxd",
"bxe",
"bxf",
"bxg",
"bxh",
"bxi",
"bxj",
"bxk",
"bxl",
"bxm",
"bxn",
"bxq",
"bxr",
"bxu",
"bxv",
"bxw",
"bxz",
"bya",
"byd",
"bye",
"byf",
"byi",
"byk",
"byl",
"bym",
"byn",
"byo",
"byp",
"byq",
"byr",
"bys",
"byt",
"byv",
"byw",
"byx",
"byz",
"bza",
"bzb",
"bzc",
"bzd",
"bze",
"bzf",
"bzg",
"bzh",
"bzi",
"bzj",
"bzk",
"bzl",
"bzn",
"bzp",
"bzq",
"bzr",
"bzt",
"bzu",
"bzv",
"bzw",
"bzx",
"bzy",
"bzz",
"caa",
"cab",
"cac",
"cad",
"cae",
"caf",
"cag",
"cah",
"caj",
"cak",
"cal",
"cam",
"can",
"cao",
"cap",
"caq",
"car",
"cas",
"cat",
"cav",
"caw",
"cax",
"cay",
"caz",
"cbb",
"cbc",
"cbd",
"cbg",
"cbi",
"cbj",
"cbk",
"cbn",
"cbo",
"cbr",
"cbs",
"cbt",
"cbu",
"cbv",
"cby",
"ccc",
"cce",
"ccg",
"cch",
"ccj",
"cco",
"ccp",
"ccr",
"cdf",
"cdh",
"cdj",
"cdm",
"cdn",
"cdo",
"cdr",
"cdy",
"cdz",
"cea",
"ceb",
"ceg",
"ces",
"cet",
"cfa",
"cfd",
"cfg",
"cfm",
"cgc",
"cgg",
"cha",
"chb",
"chc",
"chd",
"che",
"chf",
"chg",
"chh",
"chj",
"chk",
"chl",
"chm",
"chn",
"cho",
"chp",
"chq",
"chr",
"cht",
"chu",
"chv",
"chw",
"chx",
"chy",
"chz",
"cia",
"cic",
"cid",
"cie",
"cik",
"cim",
"cin",
"cip",
"cir",
"ciw",
"ciy",
"cja",
"cje",
"cjh",
"cji",
"cjk",
"cjm",
"cjn",
"cjo",
"cjp",
"cjs",
"cjv",
"cjy",
"ckb",
"ckh",
"ckl",
"cko",
"ckq",
"cks",
"ckt",
"cku",
"ckv",
"ckx",
"cla",
"clc",
"cld",
"cle",
"clh",
"cli",
"clk",
"cll",
"clm",
"clo",
"clw",
"cly",
"cme",
"cmg",
"cmi",
"cml",
"cmn",
"cmo",
"cnb",
"cng",
"cnh",
"cni",
"cnk",
"cnl",
"cnp",
"cnr",
"cns",
"cnt",
"cnu",
"cnx",
"cob",
"coc",
"cod",
"coe",
"cof",
"cog",
"coh",
"coj",
"cok",
"col",
"com",
"con",
"coo",
"cop",
"coq",
"cor",
"cos",
"cot",
"cou",
"cow",
"coz",
"cpa",
"cpb",
"cpc",
"cpg",
"cpi",
"cpn",
"cps",
"cpu",
"cpx",
"cqd",
"cra",
"crb",
"crc",
"crd",
"cre",
"crg",
"crh",
"cri",
"crj",
"crk",
"crl",
"crm",
"crn",
"cro",
"crq",
"crr",
"crs",
"crt",
"crw",
"crx",
"cry",
"crz",
"csa",
"csb",
"csh",
"csi",
"csm",
"cso",
"csp",
"css",
"cst",
"csw",
"csy",
"csz",
"cta",
"ctc",
"ctd",
"cte",
"ctg",
"ctl",
"ctm",
"ctn",
"cto",
"ctp",
"cts",
"ctu",
"ctz",
"cua",
"cub",
"cuc",
"cuh",
"cui",
"cuj",
"cuk",
"cul",
"cuo",
"cup",
"cuq",
"cur",
"cut",
"cuu",
"cuv",
"cux",
"cuy",
"cvg",
"cvn",
"cwd",
"cwe",
"cwg",
"cwt",
"cya",
"cyb",
"cym",
"cyo",
"czh",
"czk",
"czn",
"czo",
"czt",
"daa",
"dac",
"dad",
"dag",
"dah",
"dai",
"daj",
"dak",
"dal",
"dam",
"dan",
"daq",
"dar",
"dau",
"dav",
"daw",
"dax",
"dba",
"dbb",
"dbd",
"dbe",
"dbf",
"dbg",
"dbi",
"dbj",
"dbl",
"dbm",
"dbn",
"dbp",
"dbq",
"dbt",
"dbu",
"dbw",
"dby",
"dcc",
"dcr",
"dda",
"ddd",
"ddg",
"ddi",
"ddj",
"ddn",
"ddo",
"dds",
"ddw",
"dec",
"ded",
"dee",
"deg",
"dei",
"del",
"dem",
"den",
"dep",
"der",
"des",
"deu",
"dev",
"dga",
"dgb",
"dgc",
"dgd",
"dge",
"dgg",
"dgh",
"dgi",
"dgl",
"dgo",
"dgr",
"dgt",
"dgw",
"dgz",
"dhd",
"dhg",
"dhi",
"dhl",
"dhr",
"dhs",
"dhu",
"dhv",
"dia",
"dib",
"dic",
"did",
"dif",
"dig",
"dih",
"dii",
"dij",
"dik",
"dil",
"dim",
"din",
"dio",
"dip",
"diq",
"dir",
"dis",
"diu",
"div",
"diw",
"dix",
"diz",
"dja",
"djb",
"djc",
"djd",
"dje",
"dji",
"djj",
"djk",
"djm",
"djn",
"djr",
"dju",
"djw",
"dkr",
"dks",
"dlg",
"dlk",
"dlm",
"dma",
"dmb",
"dmc",
"dmd",
"dme",
"dmf",
"dmg",
"dmk",
"dml",
"dmm",
"dmo",
"dmr",
"dms",
"dmu",
"dmv",
"dmw",
"dmy",
"dna",
"dnd",
"dne",
"dng",
"dni",
"dnj",
"dnn",
"dnr",
"dnt",
"dnu",
"dnw",
"dny",
"doa",
"dob",
"doc",
"doe",
"dof",
"doh",
"doi",
"dok",
"dol",
"don",
"doo",
"dor",
"dos",
"dot",
"dow",
"dox",
"doy",
"doz",
"dpp",
"drb",
"drd",
"dre",
"drg",
"dri",
"drl",
"drn",
"dro",
"drs",
"dru",
"dry",
"dsb",
"dsh",
"dsn",
"dso",
"dsq",
"dta",
"dtb",
"dtd",
"dth",
"dti",
"dtk",
"dtm",
"dto",
"dtp",
"dtr",
"dts",
"dtt",
"dtu",
"dty",
"dua",
"duc",
"due",
"duf",
"dug",
"duk",
"dum",
"dun",
"duo",
"dup",
"duq",
"dur",
"dus",
"duu",
"duv",
"duw",
"dux",
"dva",
"dwa",
"dwk",
"dwr",
"dws",
"dwu",
"dww",
"dya",
"dyb",
"dyd",
"dyi",
"dym",
"dyn",
"dyo",
"dyu",
"dyy",
"dze",
"dzg",
"dzo",
"ebg",
"ebk",
"ebo",
"ebr",
"ebu",
"ecy",
"eee",
"efa",
"efi",
"ega",
"egl",
"ego",
"egy",
"ehu",
"eip",
"eit",
"eiv",
"eja",
"eka",
"eke",
"ekg",
"ekk",
"ekl",
"ekm",
"eko",
"ekp",
"ekr",
"eky",
"ele",
"eli",
"elk",
"ell",
"elm",
"elo",
"elx",
"ema",
"emb",
"eme",
"emg",
"emi",
"emk",
"emn",
"emp",
"ems",
"emw",
"emy",
"ena",
"enb",
"enc",
"end",
"enf",
"eng",
"enh",
"enl",
"enm",
"enn",
"eno",
"enq",
"enr",
"enw",
"enx",
"eot",
"epi",
"epo",
"erg",
"erh",
"eri",
"erk",
"ero",
"err",
"ers",
"ert",
"erw",
"ese",
"esg",
"esh",
"esi",
"esk",
"eso",
"esq",
"ess",
"esu",
"etb",
"etc",
"etn",
"eto",
"etr",
"ets",
"ett",
"etu",
"etx",
"etz",
"eus",
"eve",
"evh",
"evn",
"ewe",
"ewo",
"ext",
"eya",
"eyo",
"faa",
"fab",
"fad",
"faf",
"fag",
"fai",
"faj",
"fak",
"fal",
"fam",
"fan",
"fao",
"fap",
"far",
"fat",
"fau",
"fax",
"fay",
"fbl",
"fer",
"ffm",
"fgr",
"fia",
"fie",
"fij",
"fil",
"fin",
"fip",
"fir",
"fit",
"fkk",
"fkv",
"fla",
"flh",
"fli",
"fll",
"fln",
"flr",
"fly",
"fmp",
"fng",
"fni",
"fod",
"foi",
"fom",
"fon",
"for",
"fos",
"fpe",
"fqs",
"fra",
"frc",
"frd",
"frk",
"frm",
"fro",
"frp",
"frq",
"frr",
"frt",
"fry",
"fub",
"fuc",
"fud",
"fuf",
"fuh",
"fuj",
"ful",
"fun",
"fur",
"fut",
"fuu",
"fuv",
"fuy",
"fvr",
"fwa",
"gaa",
"gab",
"gac",
"gad",
"gae",
"gaf",
"gag",
"gah",
"gai",
"gaj",
"gal",
"gam",
"gan",
"gao",
"gap",
"gaq",
"gar",
"gat",
"gaw",
"gax",
"gay",
"gaz",
"gba",
"gbb",
"gbd",
"gbe",
"gbg",
"gbi",
"gbj",
"gbm",
"gbp",
"gbr",
"gbu",
"gbw",
"gbx",
"gby",
"gbz",
"gcc",
"gcd",
"gce",
"gcf",
"gcl",
"gcn",
"gcr",
"gdb",
"gdc",
"gdd",
"gde",
"gdf",
"gdg",
"gdh",
"gdi",
"gdj",
"gdl",
"gdm",
"gdn",
"gdo",
"gdq",
"gdr",
"gdu",
"gdx",
"gea",
"geb",
"ged",
"geg",
"geh",
"gei",
"gej",
"gek",
"gel",
"geq",
"ges",
"gev",
"gew",
"gey",
"gez",
"gfk",
"gft",
"gga",
"ggb",
"ggd",
"gge",
"ggk",
"ggl",
"ggt",
"ggu",
"ggw",
"gha",
"ghk",
"ghl",
"ghn",
"gho",
"ghs",
"ght",
"gia",
"gic",
"gid",
"gil",
"gim",
"gin",
"gip",
"giq",
"gir",
"gis",
"git",
"giw",
"gix",
"giy",
"giz",
"gjm",
"gjn",
"gju",
"gke",
"gkn",
"gkp",
"gku",
"gla",
"glc",
"gld",
"gle",
"glg",
"glh",
"glj",
"glk",
"glo",
"glv",
"glw",
"gly",
"gma",
"gmb",
"gmh",
"gml",
"gmm",
"gmu",
"gmv",
"gmy",
"gna",
"gnc",
"gnd",
"gne",
"gng",
"gnh",
"gni",
"gnk",
"gnl",
"gnm",
"gnn",
"gno",
"gnq",
"gnr",
"gnu",
"goa",
"gob",
"goc",
"god",
"gof",
"gog",
"goh",
"goi",
"gol",
"gom",
"gon",
"gop",
"gor",
"gos",
"got",
"gou",
"gow",
"gox",
"gpa",
"gpe",
"gpn",
"gqa",
"gqi",
"gqn",
"gqr",
"gqu",
"gra",
"grb",
"grc",
"grd",
"grg",
"grh",
"gri",
"grn",
"gro",
"grq",
"grr",
"grs",
"grt",
"gru",
"grx",
"gsl",
"gsn",
"gso",
"gsp",
"gsw",
"gta",
"gtu",
"gua",
"gub",
"guc",
"gud",
"gue",
"guf",
"gug",
"guh",
"gui",
"guj",
"guk",
"gul",
"gum",
"gun",
"guo",
"gup",
"guq",
"gur",
"gut",
"guu",
"gux",
"guz",
"gva",
"gvc",
"gve",
"gvf",
"gvj",
"gvl",
"gvm",
"gvn",
"gvo",
"gvp",
"gvr",
"gvs",
"gwa",
"gwc",
"gwd",
"gwe",
"gwf",
"gwi",
"gwj",
"gwn",
"gwr",
"gwt",
"gwu",
"gww",
"gwx",
"gxx",
"gya",
"gyb",
"gyd",
"gye",
"gyl",
"gym",
"gyn",
"gyo",
"gyr",
"gyy",
"gza",
"gzi",
"gzn",
"haa",
"hac",
"had",
"hae",
"hag",
"hah",
"hai",
"haj",
"hak",
"hal",
"ham",
"han",
"hao",
"haq",
"har",
"has",
"hat",
"hau",
"hav",
"haw",
"hax",
"hay",
"haz",
"hbb",
"hbn",
"hbo",
"hbs",
"hbu",
"hca",
"hch",
"hdn",
"hdy",
"hea",
"heb",
"hed",
"heg",
"heh",
"hei",
"hem",
"her",
"hgm",
"hia",
"hib",
"hid",
"hif",
"hig",
"hih",
"hik",
"hil",
"hin",
"hio",
"hit",
"hiw",
"hix",
"hka",
"hke",
"hkk",
"hla",
"hlb",
"hlu",
"hmb",
"hmd",
"hml",
"hmn",
"hmo",
"hmq",
"hmr",
"hms",
"hmt",
"hmu",
"hmv",
"hna",
"hnd",
"hne",
"hng",
"hnh",
"hni",
"hnj",
"hnn",
"hno",
"hns",
"hnu",
"hoa",
"hob",
"hoc",
"hod",
"hoe",
"hoh",
"hoi",
"hoj",
"hol",
"hoo",
"hop",
"hot",
"hov",
"how",
"hoz",
"hpo",
"hra",
"hrc",
"hre",
"hrk",
"hro",
"hrt",
"hru",
"hrv",
"hrw",
"hrx",
"hrz",
"hsb",
"hsn",
"hss",
"hti",
"hto",
"hts",
"htu",
"htx",
"hub",
"huc",
"hud",
"hue",
"huf",
"hug",
"huh",
"hui",
"huk",
"hul",
"hum",
"hun",
"huo",
"hup",
"huq",
"hur",
"hus",
"hut",
"huu",
"huv",
"huw",
"hux",
"huz",
"hvc",
"hve",
"hvn",
"hvv",
"hwa",
"hwc",
"hwo",
"hya",
"hye",
"hyw",
"iai",
"ian",
"iar",
"iba",
"ibb",
"ibd",
"ibe",
"ibg",
"ibl",
"ibn",
"ibo",
"ibr",
"ibu",
"iby",
"ica",
"ich",
"ida",
"idb",
"idc",
"idd",
"idi",
"ido",
"idr",
"idt",
"idu",
"ifa",
"ifb",
"ife",
"iff",
"ifk",
"ifm",
"ifu",
"ify",
"igb",
"ige",
"igl",
"igm",
"ign",
"igo",
"igs",
"ihp",
"iii",
"iin",
"ijc",
"ije",
"ijj",
"ijn",
"ijs",
"ike",
"iki",
"ikl",
"iko",
"ikt",
"iku",
"ikw",
"ikx",
"ikz",
"ilb",
"ile",
"ilk",
"ilo",
"ilp",
"ilu",
"ilv",
"imi",
"iml",
"imn",
"imo",
"imr",
"ims",
"ina",
"inb",
"ind",
"ing",
"inh",
"inj",
"inm",
"inn",
"ino",
"inp",
"int",
"inz",
"ior",
"iou",
"iow",
"ipi",
"ipk",
"ipo",
"iqu",
"ire",
"irh",
"iri",
"irk",
"irn",
"iru",
"irx",
"iry",
"isa",
"isc",
"isd",
"isi",
"isk",
"isl",
"iso",
"ist",
"isu",
"ita",
"itb",
"itd",
"ite",
"iti",
"itk",
"itl",
"itm",
"ito",
"itr",
"itv",
"itw",
"itx",
"itz",
"ium",
"ivb",
"ivv",
"iwm",
"iws",
"ixc",
"ixl",
"iyo",
"iyx",
"izh",
"izr",
"izz",
"jaa",
"jab",
"jac",
"jae",
"jaf",
"jah",
"jaj",
"jak",
"jal",
"jam",
"jao",
"jaq",
"jau",
"jav",
"jax",
"jay",
"jaz",
"jbe",
"jbi",
"jbj",
"jbk",
"jbn",
"jbo",
"jbt",
"jbu",
"jct",
"jdt",
"jeb",
"jee",
"jeh",
"jei",
"jek",
"jel",
"jen",
"jer",
"jet",
"jeu",
"jge",
"jgk",
"jgo",
"jhi",
"jia",
"jib",
"jic",
"jid",
"jig",
"jii",
"jil",
"jim",
"jio",
"jit",
"jiu",
"jiv",
"jjr",
"jka",
"jkm",
"jko",
"jkp",
"jku",
"jle",
"jma",
"jmb",
"jmc",
"jmd",
"jmi",
"jml",
"jmr",
"jms",
"jmx",
"jna",
"jng",
"jni",
"jnj",
"jns",
"job",
"jod",
"jor",
"jow",
"jpa",
"jpn",
"jqr",
"jra",
"jrb",
"jrr",
"jru",
"jua",
"jub",
"juc",
"jud",
"juh",
"juk",
"jul",
"jum",
"jun",
"juo",
"jup",
"jur",
"jut",
"juu",
"juw",
"juy",
"jvn",
"jya",
"kaa",
"kab",
"kac",
"kad",
"kae",
"kag",
"kah",
"kai",
"kaj",
"kak",
"kal",
"kam",
"kan",
"kao",
"kap",
"kaq",
"kas",
"kat",
"kau",
"kav",
"kaw",
"kay",
"kaz",
"kbb",
"kbc",
"kbd",
"kbe",
"kbh",
"kbi",
"kbj",
"kbk",
"kbl",
"kbm",
"kbn",
"kbo",
"kbp",
"kbq",
"kbr",
"kbs",
"kbt",
"kbv",
"kbw",
"kbx",
"kby",
"kbz",
"kca",
"kcb",
"kcd",
"kcf",
"kcg",
"kci",
"kcj",
"kck",
"kcl",
"kcm",
"kcn",
"kco",
"kcp",
"kcr",
"kct",
"kcu",
"kcv",
"kcw",
"kcx",
"kcy",
"kda",
"kdc",
"kdd",
"kde",
"kdf",
"kdh",
"kdi",
"kdj",
"kdk",
"kdl",
"kdm",
"kdn",
"kdp",
"kdq",
"kdr",
"kdt",
"kdu",
"kdw",
"kdx",
"kdz",
"kea",
"keb",
"kec",
"ked",
"kee",
"kef",
"keg",
"kei",
"kek",
"kel",
"kem",
"ken",
"keo",
"ker",
"kes",
"ket",
"keu",
"kew",
"kex",
"kez",
"kfa",
"kfb",
"kfc",
"kfd",
"kfe",
"kff",
"kfi",
"kfk",
"kfm",
"kfn",
"kfo",
"kfp",
"kfq",
"kfr",
"kfy",
"kfz",
"kga",
"kgb",
"kge",
"kgf",
"kgg",
"kgj",
"kgk",
"kgl",
"kgo",
"kgp",
"kgq",
"kgr",
"kgs",
"kgt",
"kgu",
"kgv",
"kgw",
"kgx",
"kgy",
"kha",
"khb",
"khc",
"khe",
"khf",
"khg",
"khh",
"khj",
"khk",
"khl",
"khm",
"khn",
"kho",
"khp",
"khq",
"khr",
"khs",
"kht",
"khu",
"khv",
"khw",
"khy",
"khz",
"kia",
"kib",
"kic",
"kid",
"kie",
"kig",
"kih",
"kii",
"kij",
"kik",
"kil",
"kim",
"kin",
"kio",
"kiq",
"kir",
"kis",
"kit",
"kiu",
"kiv",
"kiw",
"kiy",
"kiz",
"kja",
"kjb",
"kjc",
"kjd",
"kje",
"kjg",
"kjh",
"kji",
"kjj",
"kjk",
"kjl",
"kjm",
"kjn",
"kjp",
"kjq",
"kjr",
"kjs",
"kju",
"kjx",
"kjy",
"kjz",
"kka",
"kkb",
"kkc",
"kke",
"kkh",
"kki",
"kkj",
"kkk",
"kkl",
"kkm",
"kko",
"kkp",
"kks",
"kkw",
"kky",
"kkz",
"kla",
"klb",
"klc",
"kld",
"kle",
"klg",
"klh",
"kli",
"klj",
"kll",
"klm",
"kln",
"klp",
"klq",
"klr",
"kls",
"klt",
"klu",
"klv",
"klw",
"klx",
"kly",
"klz",
"kma",
"kmb",
"kmc",
"kmd",
"kmf",
"kmg",
"kmh",
"kmi",
"kmk",
"kml",
"kmm",
"kmn",
"kmo",
"kmq",
"kmr",
"kms",
"kmt",
"kmu",
"kmv",
"kmw",
"kmx",
"kmz",
"kna",
"knb",
"knc",
"knd",
"kne",
"knf",
"kng",
"kni",
"knj",
"knk",
"knm",
"knn",
"kno",
"knp",
"knq",
"knr",
"kns",
"knt",
"knv",
"knw",
"knx",
"kny",
"kod",
"koe",
"kof",
"kog",
"koh",
"koi",
"kok",
"kol",
"kon",
"koo",
"kop",
"koq",
"kor",
"kos",
"kot",
"kou",
"kov",
"koy",
"koz",
"kpc",
"kpe",
"kpf",
"kpg",
"kph",
"kpj",
"kpk",
"kpl",
"kpm",
"kpn",
"kpo",
"kpq",
"kpr",
"kps",
"kpt",
"kpu",
"kpv",
"kpw",
"kpx",
"kpy",
"kpz",
"kqa",
"kqb",
"kqc",
"kqe",
"kqf",
"kqi",
"kqj",
"kql",
"kqn",
"kqo",
"kqq",
"kqr",
"kqs",
"kqt",
"kqu",
"kqv",
"kqw",
"kqx",
"kqy",
"kqz",
"krb",
"krc",
"krd",
"kre",
"krf",
"krh",
"kri",
"krj",
"krk",
"krl",
"krp",
"krr",
"krs",
"kru",
"krx",
"kry",
"krz",
"ksb",
"ksc",
"ksd",
"kse",
"ksf",
"ksg",
"ksh",
"ksi",
"ksj",
"ksk",
"ksl",
"ksm",
"ksn",
"ksp",
"ksq",
"ksr",
"kss",
"kst",
"ksv",
"ksw",
"ksx",
"ksz",
"ktb",
"ktd",
"kte",
"ktg",
"kth",
"kti",
"ktj",
"ktk",
"ktm",
"ktn",
"kto",
"ktp",
"kts",
"ktt",
"ktu",
"ktv",
"ktw",
"ktx",
"ktz",
"kua",
"kub",
"kud",
"kue",
"kug",
"kui",
"kuj",
"kul",
"kum",
"kun",
"kuo",
"kup",
"kuq",
"kus",
"kut",
"kuu",
"kux",
"kuy",
"kuz",
"kva",
"kvb",
"kvc",
"kvd",
"kve",
"kvf",
"kvg",
"kvh",
"kvj",
"kvm",
"kvn",
"kvo",
"kvq",
"kvr",
"kvu",
"kvv",
"kvw",
"kvy",
"kwa",
"kwb",
"kwd",
"kwe",
"kwf",
"kwg",
"kwh",
"kwi",
"kwj",
"kwk",
"kwl",
"kwn",
"kwo",
"kwp",
"kws",
"kwv",
"kwy",
"kwz",
"kxa",
"kxb",
"kxc",
"kxd",
"kxf",
"kxh",
"kxi",
"kxj",
"kxm",
"kxn",
"kxo",
"kxv",
"kxw",
"kxz",
"kya",
"kyc",
"kye",
"kyf",
"kyh",
"kyi",
"kyj",
"kyk",
"kyl",
"kym",
"kyo",
"kyq",
"kyr",
"kys",
"kyt",
"kyu",
"kyx",
"kyz",
"kza",
"kzb",
"kzc",
"kzd",
"kze",
"kzf",
"kzg",
"kzi",
"kzk",
"kzl",
"kzm",
"kzn",
"kzo",
"kzp",
"kzq",
"kzr",
"kzs",
"kzu",
"kzw",
"kzx",
"kzz",
"laa",
"lac",
"lad",
"lae",
"laf",
"lag",
"lah",
"lai",
"laj",
"lam",
"lan",
"lao",
"lap",
"laq",
"lar",
"las",
"lat",
"lau",
"law",
"lax",
"laz",
"lbb",
"lbc",
"lbe",
"lbf",
"lbj",
"lbk",
"lbl",
"lbn",
"lbo",
"lbq",
"lbr",
"lbt",
"lbu",
"lbv",
"lbw",
"lbx",
"lby",
"lbz",
"lcc",
"lch",
"lcm",
"lcp",
"lcq",
"lcs",
"lda",
"ldb",
"ldi",
"ldj",
"ldk",
"ldl",
"ldm",
"ldn",
"lea",
"leb",
"lec",
"led",
"lee",
"lef",
"leh",
"lei",
"lej",
"lek",
"lem",
"len",
"lep",
"leq",
"ler",
"les",
"let",
"leu",
"lev",
"lew",
"lex",
"lez",
"lfa",
"lfn",
"lga",
"lgb",
"lgg",
"lgh",
"lgi",
"lgk",
"lgl",
"lgn",
"lgq",
"lgr",
"lgt",
"lgu",
"lha",
"lhh",
"lhi",
"lhm",
"lhn",
"lhs",
"lht",
"lhu",
"lia",
"lib",
"lic",
"lid",
"lif",
"lig",
"lih",
"lij",
"lil",
"lim",
"lin",
"lio",
"lip",
"liq",
"lir",
"lis",
"lit",
"liu",
"liv",
"lix",
"liy",
"liz",
"lja",
"lje",
"lji",
"ljl",
"ljp",
"lka",
"lkb",
"lkd",
"lke",
"lkh",
"lki",
"lkl",
"lkn",
"lkr",
"lkt",
"lku",
"lky",
"lla",
"llc",
"lld",
"lle",
"llf",
"llg",
"lli",
"llk",
"lll",
"llm",
"lln",
"llp",
"llu",
"lma",
"lmb",
"lmc",
"lme",
"lmg",
"lmh",
"lmk",
"lml",
"lmn",
"lmo",
"lmp",
"lmr",
"lmu",
"lmw",
"lmx",
"lmy",
"lna",
"lnb",
"lnd",
"lnh",
"lnj",
"lnl",
"lnn",
"lns",
"lnu",
"loa",
"lob",
"loc",
"loe",
"lof",
"log",
"loh",
"loi",
"loj",
"lok",
"lol",
"lom",
"lon",
"loo",
"lop",
"lor",
"los",
"lot",
"lou",
"loy",
"loz",
"lpa",
"lpe",
"lra",
"lrc",
"lre",
"lrg",
"lrl",
"lro",
"lrv",
"lrz",
"lsa",
"lsd",
"lse",
"lsh",
"lsi",
"lsm",
"lsr",
"ltc",
"ltg",
"lti",
"ltn",
"ltz",
"lua",
"lub",
"luc",
"lud",
"lue",
"luf",
"lug",
"lui",
"luj",
"lul",
"lum",
"lun",
"luo",
"lup",
"luq",
"lur",
"lus",
"lut",
"luw",
"luy",
"lva",
"lvk",
"lvs",
"lwh",
"lwl",
"lwo",
"lww",
"lyg",
"lyn",
"lzh",
"lzl",
"lzz",
"maa",
"mad",
"mae",
"maf",
"mag",
"mah",
"mai",
"maj",
"mak",
"mal",
"mam",
"man",
"maq",
"mar",
"mas",
"mat",
"mau",
"mav",
"maw",
"max",
"maz",
"mbb",
"mbc",
"mbd",
"mbe",
"mbf",
"mbh",
"mbi",
"mbj",
"mbl",
"mbm",
"mbn",
"mbo",
"mbp",
"mbq",
"mbr",
"mbs",
"mbt",
"mbu",
"mbv",
"mbw",
"mbx",
"mbz",
"mca",
"mcb",
"mcc",
"mcd",
"mcf",
"mcg",
"mch",
"mci",
"mcj",
"mck",
"mcm",
"mcn",
"mco",
"mcp",
"mcq",
"mcr",
"mcs",
"mct",
"mcu",
"mcv",
"mcw",
"mcx",
"mcy",
"mcz",
"mda",
"mdb",
"mdc",
"mdd",
"mde",
"mdf",
"mdh",
"mdi",
"mdj",
"mdm",
"mdp",
"mdq",
"mdr",
"mds",
"mdt",
"mdv",
"mdw",
"mdx",
"mdy",
"mdz",
"mea",
"meb",
"mec",
"med",
"mee",
"mef",
"mei",
"mej",
"mek",
"mel",
"mem",
"men",
"meo",
"mep",
"meq",
"mer",
"met",
"meu",
"mev",
"mew",
"mey",
"mez",
"mfa",
"mfb",
"mfc",
"mfd",
"mfe",
"mff",
"mfg",
"mfh",
"mfi",
"mfj",
"mfl",
"mfm",
"mfn",
"mfo",
"mfp",
"mfq",
"mfr",
"mft",
"mfu",
"mfv",
"mfw",
"mfx",
"mfy",
"mfz",
"mga",
"mgb",
"mgc",
"mgd",
"mge",
"mgf",
"mgh",
"mgi",
"mgk",
"mgl",
"mgm",
"mgo",
"mgp",
"mgq",
"mgr",
"mgs",
"mgu",
"mgv",
"mgw",
"mgy",
"mgz",
"mha",
"mhb",
"mhc",
"mhd",
"mhe",
"mhf",
"mhg",
"mhi",
"mhj",
"mhk",
"mhl",
"mhn",
"mho",
"mhq",
"mhr",
"mhs",
"mht",
"mhu",
"mhw",
"mhx",
"mhy",
"mhz",
"mia",
"mib",
"mic",
"mid",
"mie",
"mif",
"mig",
"mih",
"mij",
"mik",
"mil",
"mim",
"min",
"mio",
"mip",
"miq",
"mir",
"mit",
"miu",
"miw",
"mix",
"mjc",
"mjd",
"mjg",
"mjh",
"mji",
"mjj",
"mjk",
"mjm",
"mjs",
"mjt",
"mjw",
"mjx",
"mjy",
"mka",
"mkc",
"mkd",
"mke",
"mkf",
"mkg",
"mki",
"mkj",
"mkl",
"mkm",
"mkn",
"mkp",
"mkq",
"mkr",
"mks",
"mkv",
"mkw",
"mky",
"mkz",
"mla",
"mlc",
"mle",
"mlf",
"mlg",
"mlh",
"mlj",
"mlk",
"mll",
"mlm",
"mln",
"mlp",
"mlq",
"mlr",
"mls",
"mlt",
"mlu",
"mlv",
"mlw",
"mlx",
"mma",
"mmb",
"mmd",
"mme",
"mmf",
"mmg",
"mmh",
"mmi",
"mmm",
"mmn",
"mmo",
"mmp",
"mmq",
"mmr",
"mmt",
"mmu",
"mmw",
"mmx",
"mmy",
"mmz",
"mna",
"mnb",
"mnc",
"mnd",
"mne",
"mnf",
"mng",
"mnh",
"mni",
"mnj",
"mnk",
"mnl",
"mnm",
"mnp",
"mnq",
"mnr",
"mns",
"mnv",
"mnw",
"mnx",
"mny",
"mnz",
"moa",
"moc",
"mod",
"moe",
"mog",
"moh",
"moi",
"moj",
"mok",
"mom",
"mop",
"moq",
"mor",
"mos",
"mot",
"mou",
"mov",
"mox",
"moy",
"moz",
"mpa",
"mpb",
"mpc",
"mpd",
"mpe",
"mpg",
"mph",
"mpi",
"mpj",
"mpk",
"mpl",
"mpm",
"mpn",
"mpp",
"mpq",
"mpr",
"mps",
"mpt",
"mpu",
"mpv",
"mpw",
"mpx",
"mpy",
"mpz",
"mqa",
"mqb",
"mqc",
"mqe",
"mqf",
"mqj",
"mqk",
"mql",
"mqm",
"mqn",
"mqo",
"mqp",
"mqr",
"mqs",
"mqu",
"mqv",
"mqw",
"mqx",
"mqy",
"mqz",
"mra",
"mrb",
"mrc",
"mrf",
"mrg",
"mrh",
"mri",
"mrj",
"mrk",
"mrl",
"mrm",
"mrn",
"mro",
"mrp",
"mrq",
"mrr",
"mrs",
"mrt",
"mru",
"mrv",
"mrw",
"mrx",
"mry",
"mrz",
"msa",
"msb",
"mse",
"msf",
"msg",
"msh",
"msj",
"msk",
"msl",
"msm",
"msn",
"mso",
"mss",
"msu",
"msw",
"msx",
"msy",
"msz",
"mta",
"mtc",
"mtd",
"mte",
"mtf",
"mtg",
"mth",
"mti",
"mtj",
"mtk",
"mtl",
"mtm",
"mtn",
"mto",
"mtp",
"mtq",
"mtr",
"mts",
"mtt",
"mtu",
"mtv",
"mty",
"mua",
"mub",
"muc",
"mue",
"mug",
"muh",
"mui",
"muk",
"mul",
"muo",
"mup",
"mur",
"mus",
"muu",
"mux",
"muy",
"muz",
"mva",
"mvb",
"mvd",
"mve",
"mvf",
"mvi",
"mvn",
"mvo",
"mvp",
"mvq",
"mvr",
"mvt",
"mvu",
"mvv",
"mvx",
"mvy",
"mvz",
"mwa",
"mwb",
"mwc",
"mwe",
"mwf",
"mwg",
"mwh",
"mwi",
"mwl",
"mwm",
"mwn",
"mwo",
"mwp",
"mwq",
"mwr",
"mwt",
"mwv",
"mww",
"mxb",
"mxc",
"mxd",
"mxe",
"mxf",
"mxg",
"mxi",
"mxj",
"mxk",
"mxm",
"mxn",
"mxp",
"mxq",
"mxr",
"mxt",
"mxu",
"mxv",
"mxw",
"mxx",
"mxy",
"mxz",
"mya",
"myb",
"mye",
"myf",
"myg",
"myh",
"myj",
"myk",
"myl",
"mym",
"myo",
"myp",
"myr",
"mys",
"myu",
"myv",
"myw",
"myx",
"myy",
"myz",
"mzb",
"mzd",
"mze",
"mzh",
"mzi",
"mzj",
"mzk",
"mzm",
"mzn",
"mzo",
"mzp",
"mzq",
"mzr",
"mzs",
"mzt",
"mzu",
"mzv",
"mzw",
"mzz",
"nab",
"nac",
"nae",
"naf",
"nag",
"naj",
"nak",
"nal",
"nam",
"nap",
"naq",
"nar",
"nas",
"nat",
"nau",
"nav",
"naw",
"nax",
"nay",
"naz",
"nba",
"nbb",
"nbc",
"nbe",
"nbh",
"nbi",
"nbj",
"nbk",
"nbl",
"nbm",
"nbn",
"nbo",
"nbp",
"nbq",
"nbr",
"nbu",
"nbv",
"nbw",
"nby",
"nca",
"ncb",
"ncc",
"ncd",
"nce",
"ncf",
"ncg",
"nch",
"nci",
"ncj",
"nck",
"ncl",
"ncm",
"ncn",
"nco",
"ncr",
"nct",
"ncu",
"ncx",
"ncz",
"nda",
"ndb",
"ndc",
"ndd",
"nde",
"ndg",
"ndh",
"ndi",
"ndj",
"ndm",
"ndn",
"ndo",
"ndq",
"ndr",
"nds",
"ndt",
"ndu",
"ndv",
"ndx",
"ndy",
"ndz",
"neb",
"nec",
"nee",
"neg",
"neh",
"nej",
"nem",
"nen",
"neo",
"neq",
"ner",
"net",
"neu",
"nev",
"new",
"ney",
"nez",
"nfa",
"nfd",
"nfl",
"nfr",
"nfu",
"nga",
"ngb",
"ngc",
"nge",
"ngg",
"ngh",
"ngi",
"ngj",
"ngk",
"ngl",
"ngn",
"ngp",
"ngq",
"ngr",
"ngs",
"ngt",
"ngu",
"ngv",
"ngx",
"ngy",
"ngz",
"nha",
"nhb",
"nhc",
"nhd",
"nhe",
"nhf",
"nhg",
"nhi",
"nhm",
"nhn",
"nho",
"nhp",
"nhq",
"nhr",
"nht",
"nhu",
"nhv",
"nhw",
"nhx",
"nhy",
"nia",
"nib",
"nid",
"nie",
"nif",
"nig",
"nih",
"nii",
"nij",
"nil",
"nim",
"nin",
"nio",
"niq",
"nir",
"nis",
"nit",
"niu",
"niv",
"niw",
"niy",
"niz",
"nja",
"njb",
"njh",
"nji",
"njj",
"njl",
"njm",
"njn",
"njo",
"njr",
"nju",
"njy",
"njz",
"nka",
"nkb",
"nkc",
"nkd",
"nke",
"nkf",
"nkg",
"nkh",
"nki",
"nkj",
"nkk",
"nko",
"nkp",
"nkr",
"nks",
"nku",
"nkw",
"nkx",
"nkz",
"nla",
"nlc",
"nld",
"nle",
"nlg",
"nli",
"nlj",
"nll",
"nlo",
"nlu",
"nlx",
"nly",
"nlz",
"nma",
"nmb",
"nmc",
"nmd",
"nme",
"nmf",
"nmg",
"nmh",
"nmj",
"nmk",
"nml",
"nmm",
"nmn",
"nmp",
"nms",
"nmt",
"nmu",
"nmv",
"nmw",
"nmx",
"nmy",
"nmz",
"nna",
"nnb",
"nnc",
"nnd",
"nnf",
"nng",
"nnh",
"nnj",
"nnk",
"nnl",
"nnm",
"nno",
"nnp",
"nnq",
"nnr",
"nnt",
"nnu",
"nnv",
"nnw",
"nny",
"nnz",
"noa",
"nob",
"noc",
"nod",
"noe",
"nog",
"noh",
"noj",
"nok",
"nol",
"nom",
"non",
"nop",
"noq",
"nor",
"not",
"nou",
"nov",
"now",
"noz",
"nph",
"npi",
"npl",
"npn",
"npo",
"nps",
"npy",
"nqg",
"nqk",
"nqm",
"nqo",
"nra",
"nrb",
"nre",
"nrf",
"nrg",
"nri",
"nrk",
"nrl",
"nrm",
"nrn",
"nrt",
"nru",
"nrz",
"nsa",
"nsb",
"nse",
"nsg",
"nsk",
"nsm",
"nsn",
"nso",
"nsq",
"nss",
"nst",
"nsu",
"nsw",
"nsx",
"nsy",
"nsz",
"nti",
"ntj",
"ntk",
"ntm",
"nto",
"ntp",
"ntu",
"ntw",
"nty",
"ntz",
"nua",
"nuc",
"nud",
"nuf",
"nug",
"nui",
"nuj",
"nuk",
"nul",
"num",
"nun",
"nuo",
"nup",
"nur",
"nus",
"nut",
"nuw",
"nux",
"nuy",
"nuz",
"nvh",
"nvo",
"nwa",
"nwb",
"nwe",
"nwi",
"nwr",
"nwy",
"nxa",
"nxe",
"nxg",
"nxi",
"nxl",
"nxn",
"nxq",
"nxr",
"nxx",
"nya",
"nyb",
"nyd",
"nye",
"nyf",
"nyh",
"nyi",
"nyj",
"nyk",
"nyl",
"nym",
"nyn",
"nyo",
"nyp",
"nyq",
"nys",
"nyt",
"nyu",
"nyv",
"nyx",
"nyy",
"nza",
"nzb",
"nzd",
"nzi",
"nzk",
"nzm",
"nzy",
"nzz",
"oaa",
"oac",
"oar",
"obi",
"obk",
"obm",
"obo",
"obr",
"obt",
"oca",
"och",
"oci",
"oco",
"ocu",
"odt",
"odu",
"ofo",
"ofs",
"ofu",
"ogb",
"ogc",
"oge",
"ogo",
"ogu",
"oht",
"oia",
"ojb",
"ojc",
"ojg",
"oji",
"ojp",
"ojs",
"ojv",
"ojw",
"oka",
"okb",
"okd",
"oke",
"oki",
"okj",
"okn",
"oko",
"okr",
"oks",
"oku",
"okv",
"ola",
"old",
"ole",
"olm",
"olo",
"olt",
"oma",
"omb",
"omc",
"omg",
"omk",
"oml",
"omo",
"omr",
"omt",
"omu",
"omw",
"omx",
"ona",
"onb",
"one",
"ong",
"oni",
"onj",
"onn",
"ono",
"onp",
"ons",
"onu",
"onw",
"ood",
"oog",
"oon",
"opa",
"opk",
"opm",
"opo",
"opt",
"opy",
"ora",
"orc",
"ore",
"org",
"orh",
"ori",
"orm",
"oro",
"orr",
"ort",
"oru",
"orv",
"orw",
"orx",
"ory",
"orz",
"osa",
"osc",
"osi",
"osp",
"oss",
"osx",
"ota",
"otd",
"ote",
"oti",
"otk",
"otl",
"otm",
"otq",
"otr",
"ots",
"ott",
"otu",
"otw",
"otz",
"oua",
"oub",
"oum",
"ovd",
"owi",
"owl",
"oyb",
"oyd",
"oym",
"ozm",
"pab",
"pac",
"pad",
"pae",
"paf",
"pag",
"pah",
"pai",
"pak",
"pal",
"pam",
"pan",
"pao",
"pap",
"paq",
"par",
"pas",
"pau",
"pav",
"paw",
"pay",
"paz",
"pbb",
"pbe",
"pbf",
"pbg",
"pbh",
"pbi",
"pbn",
"pbo",
"pbp",
"pbr",
"pbs",
"pbt",
"pbu",
"pbv",
"pby",
"pca",
"pcb",
"pcc",
"pcd",
"pce",
"pci",
"pcj",
"pck",
"pcm",
"pcn",
"pcp",
"pda",
"pdc",
"pdo",
"pdt",
"pdu",
"peb",
"ped",
"pee",
"pef",
"peg",
"peh",
"pei",
"pej",
"pek",
"pem",
"peo",
"pep",
"peq",
"pes",
"pex",
"pez",
"pfe",
"pfl",
"pga",
"pgd",
"pgk",
"pgn",
"pgu",
"pha",
"phg",
"phk",
"phl",
"phn",
"pho",
"pht",
"pia",
"pib",
"pic",
"pid",
"pie",
"pif",
"pih",
"pij",
"pil",
"pim",
"pin",
"pio",
"pip",
"pir",
"pis",
"pit",
"piu",
"piv",
"piw",
"pix",
"piz",
"pjt",
"pka",
"pkb",
"pkc",
"pkg",
"pkh",
"pkn",
"pko",
"pkp",
"pkt",
"pku",
"pla",
"plb",
"plc",
"ple",
"plg",
"plh",
"pli",
"plj",
"plk",
"pll",
"pln",
"plo",
"plq",
"plr",
"pls",
"plt",
"plu",
"plv",
"plw",
"ply",
"plz",
"pma",
"pmb",
"pmf",
"pmh",
"pmi",
"pmk",
"pml",
"pmm",
"pmn",
"pmo",
"pmq",
"pmr",
"pms",
"pmt",
"pmw",
"pmy",
"pmz",
"pna",
"pnb",
"pne",
"png",
"pnh",
"pni",
"pnj",
"pnn",
"pno",
"pnp",
"pnq",
"pnr",
"pns",
"pnt",
"pnu",
"pnv",
"pnw",
"pny",
"poc",
"poe",
"pof",
"pog",
"poh",
"poi",
"pok",
"pol",
"pom",
"pon",
"poo",
"poq",
"por",
"pos",
"pot",
"pov",
"pow",
"pox",
"poy",
"ppe",
"ppi",
"ppk",
"ppl",
"ppm",
"ppn",
"ppo",
"ppq",
"ppt",
"ppu",
"pqa",
"pqm",
"prc",
"prd",
"pre",
"prf",
"prg",
"prh",
"pri",
"prk",
"prm",
"prn",
"pro",
"prq",
"prr",
"prs",
"pru",
"prw",
"prx",
"psa",
"pse",
"psh",
"psi",
"psm",
"psn",
"psq",
"pss",
"pst",
"psu",
"psw",
"pta",
"pth",
"pti",
"ptn",
"pto",
"ptp",
"ptr",
"ptu",
"ptv",
"pua",
"pub",
"puc",
"pue",
"pui",
"puj",
"pum",
"puo",
"pup",
"puq",
"pur",
"puu",
"puw",
"puy",
"pwa",
"pwg",
"pwi",
"pwm",
"pwn",
"pwo",
"pym",
"pyn",
"pyu",
"pyy",
"qua",
"qub",
"quc",
"que",
"quf",
"qug",
"quh",
"qui",
"quk",
"qul",
"qum",
"qun",
"qup",
"qus",
"quv",
"quw",
"qux",
"quy",
"quz",
"qva",
"qvc",
"qve",
"qvi",
"qvl",
"qvn",
"qvo",
"qvs",
"qvw",
"qvy",
"qvz",
"qwa",
"qwh",
"qwm",
"qwt",
"qxa",
"qxn",
"qxo",
"qxp",
"qxq",
"qxr",
"qxs",
"qxu",
"qxw",
"qya",
"qyp",
"raa",
"rab",
"rac",
"rad",
"raf",
"rag",
"rah",
"rai",
"raj",
"rak",
"ral",
"ram",
"ran",
"rao",
"rap",
"rar",
"ras",
"rau",
"rav",
"raw",
"ray",
"raz",
"rbb",
"rbp",
"rcf",
"rea",
"reb",
"ree",
"reg",
"rej",
"rel",
"rem",
"ren",
"res",
"rey",
"rga",
"rgk",
"rgn",
"rgr",
"rgs",
"rhg",
"rhp",
"rif",
"ril",
"rim",
"rin",
"rir",
"rit",
"riu",
"rji",
"rjs",
"rkb",
"rkh",
"rki",
"rkm",
"rma",
"rmb",
"rmc",
"rme",
"rmf",
"rmg",
"rmh",
"rmi",
"rmk",
"rml",
"rmm",
"rmn",
"rmo",
"rmp",
"rmq",
"rmt",
"rmu",
"rmv",
"rmw",
"rmy",
"rnd",
"rng",
"rnn",
"rnp",
"rnr",
"rnw",
"rob",
"rod",
"roe",
"rof",
"rog",
"roh",
"rol",
"rom",
"ron",
"roo",
"rop",
"rou",
"rpn",
"rpt",
"rri",
"rro",
"rsb",
"rth",
"rtm",
"rtw",
"rub",
"rue",
"ruf",
"rug",
"rui",
"ruk",
"run",
"ruo",
"rup",
"ruq",
"rus",
"rut",
"ruu",
"rwa",
"rwk",
"rwm",
"rwo",
"rwr",
"ryn",
"rys",
"ryu",
"sab",
"sac",
"sad",
"sae",
"saf",
"sag",
"sah",
"saj",
"sak",
"sam",
"san",
"sao",
"saq",
"sar",
"sas",
"sat",
"sau",
"sav",
"saw",
"sax",
"say",
"saz",
"sba",
"sbb",
"sbc",
"sbd",
"sbe",
"sbf",
"sbg",
"sbh",
"sbi",
"sbk",
"sbl",
"sbm",
"sbo",
"sbp",
"sbq",
"sbr",
"sbs",
"sbt",
"sbu",
"sbv",
"sbw",
"scb",
"sce",
"scg",
"sci",
"sck",
"scl",
"scn",
"sco",
"scp",
"scq",
"scs",
"scu",
"scv",
"scw",
"sda",
"sdc",
"sde",
"sdg",
"sdh",
"sdj",
"sdn",
"sdo",
"sdp",
"sdq",
"sds",
"sdt",
"sdz",
"sea",
"sec",
"sed",
"see",
"sef",
"seh",
"sei",
"sej",
"sek",
"sel",
"sen",
"seo",
"seq",
"ser",
"ses",
"set",
"seu",
"sev",
"sew",
"sey",
"sga",
"sgc",
"sgd",
"sge",
"sgh",
"sgi",
"sgm",
"sgp",
"sgr",
"sgs",
"sgt",
"sgw",
"sgy",
"sgz",
"sha",
"shb",
"she",
"shg",
"shh",
"shi",
"shj",
"shk",
"shl",
"shn",
"sho",
"shp",
"shq",
"shr",
"shs",
"sht",
"shu",
"shv",
"shw",
"shx",
"shy",
"shz",
"sia",
"sib",
"sid",
"sie",
"sif",
"sig",
"sii",
"sij",
"sik",
"sil",
"sim",
"sin",
"sip",
"siq",
"sir",
"sis",
"siu",
"siw",
"six",
"siy",
"siz",
"sja",
"sjd",
"sje",
"sjg",
"sjk",
"sjl",
"sjm",
"sjn",
"sjo",
"sjr",
"sjs",
"sjt",
"sju",
"sjw",
"ska",
"skb",
"skc",
"skd",
"ske",
"skf",
"skg",
"skh",
"ski",
"skj",
"skm",
"skr",
"sks",
"skt",
"sku",
"skv",
"skw",
"skx",
"sky",
"skz",
"slc",
"slg",
"slh",
"sli",
"slj",
"slk",
"sll",
"slm",
"sln",
"slp",
"slr",
"slu",
"slv",
"slw",
"sly",
"slz",
"sma",
"smb",
"smc",
"sme",
"smg",
"smj",
"smk",
"sml",
"smn",
"smo",
"smp",
"smq",
"smr",
"sms",
"smw",
"smx",
"smy",
"sna",
"snc",
"snd",
"sne",
"snf",
"sng",
"sni",
"snk",
"snl",
"snn",
"sno",
"snp",
"snq",
"snr",
"sns",
"snu",
"snv",
"snw",
"snx",
"sny",
"snz",
"sob",
"soc",
"sod",
"soe",
"sog",
"soh",
"sok",
"sol",
"som",
"sop",
"soq",
"sor",
"sos",
"sot",
"sou",
"sov",
"sow",
"soz",
"spa",
"spb",
"spc",
"spd",
"spe",
"spi",
"spl",
"spm",
"spn",
"spo",
"spp",
"spr",
"sps",
"spu",
"spx",
"spy",
"sqa",
"sqi",
"sqn",
"sqo",
"sqq",
"sqr",
"sqt",
"squ",
"sra",
"srb",
"src",
"srd",
"srf",
"srh",
"sri",
"srk",
"srl",
"srm",
"srn",
"sro",
"srp",
"srq",
"srr",
"srs",
"sru",
"srv",
"srw",
"sry",
"srz",
"ssb",
"ssc",
"ssd",
"sse",
"ssf",
"ssg",
"ssj",
"ssk",
"ssl",
"ssm",
"sso",
"ssp",
"ssq",
"sss",
"sst",
"ssu",
"ssv",
"ssw",
"ssx",
"ssy",
"ssz",
"ste",
"stf",
"stg",
"sth",
"sti",
"stj",
"stk",
"stl",
"stn",
"sto",
"stp",
"stq",
"str",
"sts",
"stu",
"stv",
"stw",
"sua",
"sub",
"suc",
"sue",
"sug",
"sui",
"suj",
"suk",
"sun",
"suq",
"sur",
"sus",
"sut",
"suv",
"suw",
"sux",
"suy",
"suz",
"sva",
"svb",
"svc",
"sve",
"svs",
"swb",
"swc",
"swe",
"swf",
"swg",
"swh",
"swi",
"swj",
"swm",
"swn",
"swo",
"swp",
"swq",
"sws",
"swt",
"swu",
"swv",
"sww",
"swx",
"swy",
"sxb",
"sxe",
"sxg",
"sxm",
"sxn",
"sxr",
"sxu",
"sya",
"syb",
"syc",
"syi",
"syk",
"syl",
"sym",
"syn",
"syr",
"sys",
"syw",
"sza",
"szb",
"szc",
"szd",
"sze",
"szg",
"szl",
"szn",
"szp",
"szv",
"szw",
"taa",
"tab",
"tac",
"tad",
"tae",
"taf",
"tag",
"tah",
"taj",
"tal",
"tam",
"tan",
"tao",
"tap",
"taq",
"tar",
"tas",
"tat",
"tau",
"tav",
"tay",
"taz",
"tba",
"tbc",
"tbd",
"tbe",
"tbf",
"tbg",
"tbh",
"tbi",
"tbj",
"tbk",
"tbl",
"tbm",
"tbn",
"tbo",
"tbp",
"tbr",
"tbs",
"tbt",
"tbu",
"tbv",
"tbw",
"tbx",
"tby",
"tbz",
"tca",
"tcb",
"tcc",
"tcd",
"tce",
"tcf",
"tcg",
"tci",
"tck",
"tcl",
"tcm",
"tcn",
"tcq",
"tcs",
"tct",
"tcx",
"tcy",
"tcz",
"tda",
"tdc",
"tdd",
"tde",
"tdf",
"tdg",
"tdh",
"tdi",
"tdj",
"tdk",
"tdl",
"tdm",
"tdn",
"tdo",
"tdr",
"tds",
"tdt",
"tdv",
"tdx",
"tdy",
"tea",
"tec",
"ted",
"tee",
"teg",
"teh",
"tei",
"tek",
"tel",
"tem",
"ten",
"teo",
"tep",
"teq",
"ter",
"tes",
"tet",
"teu",
"tev",
"tew",
"tex",
"tey",
"tfn",
"tfr",
"tft",
"tgb",
"tgc",
"tgd",
"tgh",
"tgi",
"tgj",
"tgk",
"tgl",
"tgn",
"tgo",
"tgp",
"tgq",
"tgr",
"tgs",
"tgt",
"tgu",
"tgw",
"tgx",
"tgy",
"tha",
"thd",
"the",
"thf",
"thk",
"thl",
"thm",
"thp",
"thq",
"thr",
"ths",
"tht",
"thv",
"thz",
"tia",
"tic",
"tif",
"tig",
"tih",
"tii",
"tij",
"tik",
"til",
"tim",
"tin",
"tio",
"tip",
"tiq",
"tir",
"tis",
"tit",
"tiu",
"tiv",
"tiw",
"tix",
"tiy",
"tiz",
"tjg",
"tji",
"tjm",
"tjs",
"tju",
"tjw",
"tkd",
"tke",
"tkg",
"tkl",
"tkm",
"tkn",
"tkp",
"tkq",
"tkr",
"tks",
"tku",
"tkw",
"tlb",
"tlc",
"tld",
"tlf",
"tlg",
"tlh",
"tli",
"tlj",
"tlk",
"tll",
"tlm",
"tlo",
"tlp",
"tlq",
"tlr",
"tls",
"tlt",
"tlu",
"tlv",
"tlx",
"tly",
"tma",
"tmb",
"tmc",
"tmd",
"tmf",
"tmi",
"tmj",
"tmm",
"tmn",
"tmo",
"tmq",
"tmr",
"tms",
"tmt",
"tmu",
"tmw",
"tmy",
"tmz",
"tna",
"tnc",
"tnh",
"tni",
"tnk",
"tnl",
"tnm",
"tnn",
"tno",
"tnp",
"tnq",
"tnr",
"tnt",
"tnw",
"tnx",
"tnz",
"tob",
"toc",
"tof",
"tog",
"toh",
"toi",
"toj",
"tol",
"tom",
"ton",
"too",
"top",
"tos",
"tow",
"tox",
"tpa",
"tpc",
"tpe",
"tpf",
"tpg",
"tpi",
"tpj",
"tpl",
"tpm",
"tpn",
"tpp",
"tpr",
"tpt",
"tpu",
"tpw",
"tpx",
"tpy",
"tpz",
"tqb",
"tql",
"tqo",
"tqp",
"tqq",
"tqu",
"tqw",
"tra",
"trc",
"trd",
"trf",
"trg",
"trh",
"tri",
"trm",
"trn",
"tro",
"trp",
"trq",
"trr",
"trs",
"trt",
"tru",
"trv",
"trw",
"trx",
"trz",
"tsa",
"tsb",
"tsc",
"tsd",
"tsg",
"tsh",
"tsi",
"tsj",
"tsk",
"tsl",
"tsn",
"tso",
"tsr",
"tsu",
"tsv",
"tsw",
"tsx",
"tsz",
"tta",
"ttc",
"ttd",
"tte",
"ttf",
"ttg",
"tti",
"ttj",
"ttk",
"ttm",
"ttn",
"ttq",
"ttr",
"tts",
"ttt",
"ttu",
"ttv",
"ttw",
"tty",
"tua",
"tub",
"tuc",
"tud",
"tue",
"tuf",
"tug",
"tuh",
"tui",
"tuk",
"tul",
"tum",
"tun",
"tuo",
"tuq",
"tur",
"tus",
"tuu",
"tuv",
"tux",
"tuy",
"tuz",
"tva",
"tvd",
"tve",
"tvk",
"tvl",
"tvm",
"tvo",
"tvs",
"tvu",
"tvw",
"tvy",
"twa",
"twb",
"twd",
"twe",
"twf",
"twg",
"twh",
"twi",
"twm",
"twp",
"twq",
"twt",
"twu",
"tww",
"twx",
"twy",
"txa",
"txb",
"txc",
"txe",
"txg",
"txh",
"txi",
"txm",
"txn",
"txo",
"txs",
"txt",
"txu",
"txx",
"txy",
"tya",
"tye",
"tyi",
"tyj",
"tyn",
"typ",
"tyr",
"tyu",
"tyv",
"tyx",
"tyz",
"tzh",
"tzj",
"tzl",
"tzm",
"tzn",
"tzo",
"tzx",
"uam",
"uan",
"uar",
"uba",
"ubi",
"ubl",
"ubr",
"ubu",
"uby",
"uda",
"ude",
"udi",
"udj",
"udl",
"udm",
"udu",
"ues",
"ufi",
"uga",
"uge",
"ugo",
"uha",
"uhn",
"uig",
"uiv",
"uji",
"uka",
"ukg",
"ukh",
"uki",
"ukp",
"ukq",
"ukr",
"uku",
"ukv",
"ula",
"ulc",
"ule",
"ulf",
"uli",
"ulk",
"ulm",
"uln",
"ulu",
"ulw",
"uma",
"umb",
"umc",
"umg",
"umm",
"umo",
"ump",
"umr",
"ums",
"umu",
"una",
"und",
"une",
"ung",
"unk",
"unm",
"unn",
"unr",
"unz",
"upv",
"ura",
"urb",
"urc",
"urd",
"ure",
"urf",
"urg",
"urh",
"uri",
"urk",
"urn",
"urp",
"urr",
"urt",
"uru",
"urv",
"urw",
"urx",
"ury",
"urz",
"usa",
"ush",
"usk",
"usp",
"usu",
"uta",
"ute",
"utp",
"utr",
"utu",
"uum",
"uur",
"uuu",
"uve",
"uvh",
"uvl",
"uwa",
"uya",
"uzn",
"uzs",
"vaa",
"vae",
"vaf",
"vag",
"vah",
"vai",
"vaj",
"val",
"vam",
"van",
"vao",
"var",
"vas",
"vau",
"vay",
"vbb",
"vbk",
"vec",
"ved",
"vel",
"vem",
"ven",
"veo",
"vep",
"ver",
"vic",
"vie",
"vif",
"vig",
"vil",
"vin",
"vit",
"viv",
"vka",
"vkj",
"vkl",
"vkm",
"vko",
"vkp",
"vku",
"vlp",
"vls",
"vma",
"vmb",
"vme",
"vmf",
"vmg",
"vmi",
"vmk",
"vml",
"vmm",
"vmp",
"vmr",
"vmw",
"vmy",
"vmz",
"vnk",
"vnm",
"vnp",
"vol",
"vor",
"vot",
"vra",
"vro",
"vrs",
"vrt",
"vun",
"vut",
"waa",
"wab",
"wac",
"wad",
"wae",
"wah",
"waj",
"wal",
"wam",
"wan",
"wao",
"wap",
"waq",
"war",
"was",
"wat",
"wau",
"wav",
"waw",
"wax",
"way",
"waz",
"wba",
"wbb",
"wbe",
"wbf",
"wbh",
"wbi",
"wbj",
"wbk",
"wbl",
"wbm",
"wbp",
"wbr",
"wbt",
"wbv",
"wbw",
"wca",
"wci",
"wdd",
"wdg",
"wdj",
"wea",
"wed",
"weg",
"weh",
"wei",
"weo",
"wep",
"wer",
"wes",
"wet",
"wew",
"wfg",
"wga",
"wgb",
"wgg",
"wgi",
"wgo",
"wgu",
"wgy",
"wha",
"whk",
"wib",
"wic",
"wig",
"wih",
"wii",
"wik",
"wil",
"wim",
"win",
"wir",
"wiu",
"wiv",
"wiy",
"wja",
"wji",
"wka",
"wkw",
"wla",
"wlc",
"wle",
"wlg",
"wlh",
"wli",
"wlk",
"wll",
"wlm",
"wln",
"wlo",
"wlr",
"wls",
"wlu",
"wlv",
"wlw",
"wly",
"wmb",
"wmc",
"wmd",
"wme",
"wmh",
"wmi",
"wmo",
"wms",
"wmt",
"wmw",
"wmx",
"wnb",
"wnc",
"wnd",
"wne",
"wni",
"wnk",
"wno",
"wnp",
"wnu",
"wnw",
"wny",
"woa",
"wob",
"woc",
"wod",
"woe",
"wof",
"wog",
"woi",
"wok",
"wol",
"wom",
"won",
"wos",
"wow",
"wpc",
"wrb",
"wrg",
"wrh",
"wri",
"wrk",
"wrl",
"wrm",
"wrn",
"wro",
"wrp",
"wrr",
"wrs",
"wru",
"wrv",
"wrw",
"wry",
"wrz",
"wsa",
"wsg",
"wsi",
"wsk",
"wsr",
"wsv",
"wtf",
"wth",
"wti",
"wtm",
"wtw",
"wub",
"wuh",
"wul",
"wum",
"wun",
"wut",
"wuu",
"wuv",
"wux",
"wuy",
"wwa",
"wwo",
"wwr",
"www",
"wxa",
"wyb",
"wyi",
"wym",
"wyr",
"wyy",
"xaa",
"xab",
"xac",
"xad",
"xai",
"xaj",
"xal",
"xam",
"xan",
"xap",
"xaq",
"xar",
"xas",
"xat",
"xau",
"xav",
"xaw",
"xbc",
"xbe",
"xbg",
"xbi",
"xbm",
"xbn",
"xbr",
"xbw",
"xce",
"xcg",
"xch",
"xcl",
"xcm",
"xcn",
"xco",
"xcr",
"xct",
"xcw",
"xcy",
"xdc",
"xdk",
"xdm",
"xdy",
"xeb",
"xed",
"xeg",
"xel",
"xem",
"xep",
"xer",
"xes",
"xet",
"xeu",
"xfa",
"xga",
"xgd",
"xgf",
"xgm",
"xhd",
"xho",
"xht",
"xhu",
"xib",
"xii",
"xil",
"xin",
"xir",
"xiy",
"xjt",
"xka",
"xkb",
"xke",
"xkg",
"xkl",
"xkn",
"xkq",
"xkr",
"xks",
"xkv",
"xky",
"xla",
"xlb",
"xlc",
"xld",
"xlg",
"xlo",
"xlp",
"xls",
"xlu",
"xly",
"xmb",
"xmf",
"xmg",
"xmh",
"xmk",
"xmm",
"xmn",
"xmr",
"xmt",
"xmu",
"xmv",
"xmw",
"xmy",
"xmz",
"xnb",
"xng",
"xni",
"xnm",
"xnn",
"xno",
"xns",
"xny",
"xnz",
"xod",
"xog",
"xok",
"xom",
"xon",
"xoo",
"xop",
"xor",
"xow",
"xpb",
"xpd",
"xpe",
"xpf",
"xpg",
"xpk",
"xpm",
"xpq",
"xpr",
"xpt",
"xpu",
"xpv",
"xpy",
"xqt",
"xra",
"xre",
"xri",
"xrn",
"xrt",
"xru",
"xrw",
"xsa",
"xsb",
"xsc",
"xse",
"xsh",
"xsi",
"xsl",
"xsm",
"xsp",
"xsr",
"xss",
"xsu",
"xsv",
"xsy",
"xta",
"xtc",
"xtd",
"xte",
"xtg",
"xtm",
"xto",
"xtv",
"xua",
"xub",
"xug",
"xul",
"xum",
"xuo",
"xup",
"xur",
"xut",
"xuu",
"xvi",
"xvn",
"xvs",
"xwa",
"xwc",
"xwd",
"xwg",
"xwk",
"xwo",
"xww",
"xxk",
"xxt",
"xyj",
"xyk",
"xyt",
"xyy",
"xzh",
"xzp",
"yaa",
"yab",
"yad",
"yae",
"yaf",
"yag",
"yah",
"yai",
"yaj",
"yak",
"yal",
"yam",
"yan",
"yao",
"yap",
"yaq",
"yar",
"yas",
"yat",
"yau",
"yav",
"yaw",
"yay",
"yaz",
"yba",
"ybb",
"ybe",
"ybh",
"ybi",
"ybj",
"ybl",
"ybm",
"ybn",
"ybo",
"ybx",
"yby",
"ycl",
"ycn",
"yda",
"ydd",
"ydg",
"ydk",
"yea",
"yec",
"yee",
"yer",
"yes",
"yet",
"yev",
"yey",
"yga",
"ygr",
"ygw",
"yha",
"yhl",
"yia",
"yig",
"yih",
"yii",
"yij",
"yim",
"yis",
"yix",
"yka",
"ykg",
"yki",
"ykm",
"yko",
"ykr",
"ykt",
"yky",
"yle",
"ylg",
"yli",
"yll",
"yln",
"ylr",
"ylu",
"yly",
"ymb",
"ymc",
"yme",
"yml",
"ymm",
"ymo",
"ymp",
"ynd",
"ynk",
"ynl",
"ynn",
"ynq",
"yns",
"ynu",
"yob",
"yog",
"yoi",
"yok",
"yol",
"yom",
"yon",
"yor",
"yot",
"yox",
"yoy",
"ypg",
"yph",
"ypz",
"yra",
"yrb",
"yre",
"yrk",
"yrl",
"yrn",
"yrw",
"ysn",
"ysr",
"yss",
"yua",
"yub",
"yuc",
"yue",
"yuf",
"yug",
"yui",
"yuj",
"yuk",
"yul",
"yum",
"yun",
"yup",
"yur",
"yut",
"yuw",
"yux",
"yuy",
"yuz",
"yva",
"yvt",
"ywa",
"ywn",
"ywq",
"ywr",
"ywt",
"yww",
"yxg",
"yxl",
"yyr",
"yyu",
"yzg",
"zaa",
"zab",
"zac",
"zad",
"zae",
"zaf",
"zag",
"zai",
"zaj",
"zak",
"zal",
"zao",
"zap",
"zaq",
"zar",
"zav",
"zaw",
"zay",
"zaz",
"zba",
"zbc",
"zbe",
"zbl",
"zbw",
"zca",
"zch",
"zdj",
"zea",
"zeg",
"zen",
"zga",
"zgb",
"zgh",
"zgn",
"zgr",
"zha",
"zhb",
"zhd",
"zhi",
"zhn",
"zho",
"zhw",
"zia",
"zik",
"zim",
"zin",
"ziw",
"ziz",
"zka",
"zkb",
"zkd",
"zkg",
"zkk",
"zko",
"zkp",
"zkr",
"zkt",
"zku",
"zlj",
"zlm",
"zln",
"zma",
"zmb",
"zmc",
"zmd",
"zme",
"zmg",
"zmj",
"zmk",
"zml",
"zmm",
"zmn",
"zmo",
"zmp",
"zmr",
"zmt",
"zmu",
"zmv",
"zmw",
"zmx",
"zmy",
"zmz",
"zne",
"zng",
"zns",
"zoc",
"zoh",
"zom",
"zoq",
"zor",
"zos",
"zpc",
"zpf",
"zpi",
"zpl",
"zpm",
"zpn",
"zpo",
"zpq",
"zpr",
"zps",
"zpu",
"zpv",
"zpw",
"zpx",
"zpz",
"zqe",
"zro",
"zrp",
"zrs",
"zsa",
"zsm",
"zsu",
"ztg",
"ztp",
"ztq",
"zts",
"ztu",
"zua",
"zuh",
"zul",
"zum",
"zun",
"zwa",
"zyb",
"zyg",
"zyj",
"zyn",
"zzj"
] | TAGS
#size_categories-10M<n<100M #language-Ghotuo #language-Alumu-Tesu #language-Ari #language-Amal #language-Arbëreshë Albanian #language-Ambrak #language-Arifama-Miniafia #language-Ankave #language-Afade #language-Anambé #language-Algerian Saharan Arabic #language-Pará Arára #language-Eastern Abnaki #language-Afar #language-Aasáx #language-Arvanitika Albanian #language-Abau #language-Solong #language-Mandobo Atas #language-Abé #language-Bankon #language-Western Abnaki #language-Abai Sungai #language-Abaga #language-Abidji #language-Aka-Bea #language-Abkhazian #language-Lampung Nyo #language-Abanyom #language-Abua #language-Abon #language-Abaza #language-Abron #language-Ambonese Malay #language-Ambulas #language-Abure #language-Baharna Arabic #language-Pal #language-Inabaknon #language-Aneme Wake #language-Abui #language-Achagua #language-Gikyode #language-Achinese #language-Saint Lucian Creole French #language-Acoli #language-Aka-Cari #language-Aka-Kora #language-Akar-Bale #language-Mesopotamian Arabic #language-Achang #language-Eastern Acipa #language-Ta'izzi-Adeni Arabic #language-Achi #language-Acroá #language-Achterhoeks #language-Achuar-Shiwiar #language-Achumawi #language-Hijazi Arabic #language-Omani Arabic #language-Cypriot Arabic #language-Adangme #language-Lidzonka #language-Adele #language-Andegerebinha #language-Adhola #language-Adi #language-Adioukrou #language-Galo #language-Adang #language-Abu #language-Adangbe #language-Adonara #language-Adnyamathanha #language-Amundava #language-Amdo Tibetan #language-Adyghe #language-Adzera #language-Tunisian Arabic #language-Saidi Arabic #language-Northeast Pashai #language-Ambele #language-Arem #language-Eastern Arrernte #language-Alsea #language-Ambakich #language-Amele #language-Aeka #language-Gulf Arabic #language-Andai #language-Putukwam #language-Akrukay #language-Nanubae #language-Defaka #language-Eloyi #language-Tapei #language-Afrikaans #language-Afitti #language-Awutu #language-Obokuitai #language-Legbo #language-Agatu #language-Agarabi #language-Angal #language-Arguni #language-Angor #language-Ngelima #language-Argobba #language-Fembe #language-Angaataha #language-Agutaynen #language-Tainae #language-Aghem #language-Aguaruna #language-Esimbi #language-Central Cagayan Agta #language-Aguacateco #language-Remontado Dumagat #language-Kahua #language-Aghul #language-Ahanta #language-Axamb #language-Qimant #language-Aghu #language-Tiagbamrin Aizi #language-Akha #language-Igo #language-Ahom #language-Ahirani #language-Ashe #language-Ahtena #language-Arosi #language-Ainu (China) #language-Alngith #language-Amara #language-Agi #language-Ai-Cham #language-Assyrian Neo-Aramaic #language-Ake #language-Aimele #language-Aimol #language-Ainu (Japan) #language-Aiton #language-Airoran #language-Arikem #language-Aari #language-Aighon #language-Aja (South Sudan) #language-Aja (Benin) #language-Ajië #language-South Levantine Arabic #language-Amri Karbi #language-Akan #language-Batak Angkola #language-Mpur #language-Ukpet-Ehom #language-Akawaio #language-Akpa #language-Anakalangu #language-Angal Heneng #language-Aiome #language-Aka-Jeru #language-Akkadian #language-Aklanon #language-Aka-Bo #language-Akurio #language-Siwu #language-Ak #language-Araki #language-Akaselem #language-Akum #language-Akhvakh #language-Akwa #language-Aka-Kede #language-Aka-Kol #language-Alabama #language-Alago #language-Qawasqar #language-Alladian #language-Aleut #language-Alege #language-Alawa #language-Amaimon #language-Alangan #language-Alak #language-Allar 
#language-Amblong #language-Gheg Albanian #language-Larike-Wakasihu #language-Alune #language-Algonquin #language-Alutor #language-Tosk Albanian #language-Southern Altai #language-'Are'are #language-Alaba-K’abeena #language-Amol #language-Alyawarr #language-Alur #language-Amanayé #language-Amahuaca #language-Yanesha' #language-Hamer-Banna #language-Amurdak #language-Amharic #language-Amis #language-Amdang #language-Ambai #language-War-Jaintia #language-Ama (Papua New Guinea) #language-Amanab #language-Amo #language-Alamblak #language-Amahai #language-Amarakaeri #language-Southern Amami-Oshima #language-Amto #language-Guerrero Amuzgo #language-Ambelau #language-Western Neo-Aramaic #language-Anmatyerre #language-Ami #language-Atampaya #language-Andaqui #language-Andoa #language-Ngas #language-Ansus #language-Xârâcùù #language-Animere #language-Old English (ca. 450-1100) #language-Nend #language-Andi #language-Anor #language-Goemai #language-Anal #language-Obolo #language-Andoque #language-Angika #language-Jarawa (India) #language-Antakarinya #language-Anuak #language-Denya #language-Anaang #language-Andra-Hus #language-Anyin #language-Anem #language-Angolar #language-Pemon #language-Andarum #language-Bragat #language-Angoram #language-Anindilyakwa #language-Mufian #language-Alor #language-Ömie #language-Bumbita Arapesh #language-Aore #language-Taikat #language-Atong (India) #language-A'ou #language-Atorada #language-Uab Meto #language-Sa'a #language-Levantine Arabic #language-Sudanese Arabic #language-Bukiyip #language-Athpariya #language-Apiaká #language-Jicarilla Apache #language-Kiowa Apache #language-Lipan Apache #language-Mescalero-Chiricahua Apache #language-Apinayé #language-Ambul #language-Apma #language-A-Pucikwar #language-Arop-Lokep #language-Apatani #language-Apurinã #language-Western Apache #language-Aputai #language-Apalaí #language-Safeyoka #language-Archi #language-Ampari Dogon #language-Arigidi #language-Atohwaim #language-Atakapa #language-Angaité #language-Akuntsu #language-Arabic #language-Standard Arabic #language-Official Aramaic (700-300 BCE) #language-Arabana #language-Western Arrarnta #language-Aragonese #language-Arhuaco #language-Arikara #language-Arikapú #language-Arabela #language-Mapudungun #language-Araona #language-Arapaho #language-Algerian Arabic #language-Karo (Brazil) #language-Najdi Arabic #language-Arbore #language-Arawak #language-Aruá (Rodonia State) #language-Moroccan Arabic #language-Egyptian Arabic #language-Asu (Tanzania) #language-Assiniboine #language-Casuarina Coast Asmat #language-American Sign Language #language-Cishingini #language-Abishira #language-Buruwai #language-Sari #language-Ashkun #language-Asilulu #language-Assamese #language-Xingú Asuriní #language-Dano #language-Asuri #language-Ipulo #language-Asturian #language-Tocantins Asurini #language-Asoa #language-Muratayak #language-Yaosakor Asmat #language-As #language-Pele-Ata #language-Zaiwa #language-Atsahuaca #language-Ata Manobo #language-Atemble #language-Attié #language-Atikamekw #language-Ati #language-Ata #language-Atong (Cameroon) #language-Aralle-Tabulahan #language-Waimiri-Atroari #language-Gros Ventre #language-Pamplona Atta #language-Northern Altai #language-Atsugewi #language-Arutani #language-Aneityum #language-Arta #language-Asumboa #language-Waorani #language-Anuta #language-Aushi #language-Anuki #language-Awjilah #language-Heyo #language-Aulua #language-Asu (Nigeria) #language-Molmo One #language-Makayam #language-Austral #language-Auye #language-Awyi #language-Awiyaana 
#language-Uzbeki Arabic #language-Avaric #language-Avau #language-Alviri-Vidari #language-Avestan #language-Avikam #language-Kotava #language-Angkamuthi #language-Avatime #language-Au #language-Avokaya #language-Avá-Canoeiro #language-Awadhi #language-Awa (Papua New Guinea) #language-Cicipu #language-Awetí #language-Anguthimri #language-Awbono #language-Aekyom #language-Awabakal #language-Arawum #language-Awngi #language-Awak #language-Awera #language-South Awyu #language-Araweté #language-Jair Awyu #language-Awara #language-Edera Awyu #language-Abipon #language-Ayerrerenge #language-Mato Grosso Arára #language-Yaka (Central African Republic) #language-Lower Southern Aranda #language-Middle Armenian #language-Xârâgurè #language-Awar #language-Ayizo Gbe #language-Southern Aymara #language-Ayabadhu #language-Ayere #language-Ginyanga #language-Hadrami Arabic #language-Leyigha #language-Libyan Arabic #language-Aymara #language-Sanaani Arabic #language-Ayoreo #language-Central Aymara #language-Ayu #language-Mai Brat #language-Azha #language-South Azerbaijani #language-Eastern Durango Nahuatl #language-San Pedro Amuzgos Amuzgo #language-North Azerbaijani #language-Ipalapa Amuzgo #language-Western Durango Nahuatl #language-Awing #language-Highland Puebla Nahuatl #language-Babatana #language-Bainouk-Gunyuño #language-Baré #language-Nubaca #language-Tuki #language-Barakai #language-Bashkir #language-Baluchi #language-Bambara #language-Balinese #language-Waimaha #language-Bantawa #language-Bavarian #language-Basa (Cameroon) #language-Bada (Nigeria) #language-Vengo #language-Bambili-Bambui #language-Bamun #language-Baatonum #language-Barai #language-Batak Toba #language-Bau #language-Bangba #language-Baibai #language-Bugan #language-Barombi #language-Ghomálá' #language-Babanki #language-Bats #language-Uneapa #language-West Central Banda #language-Bamali #language-Girawa #language-Bakpinka #language-Mburku #language-Kulung (Nigeria) #language-Karnai #language-Baba #language-Befang #language-Central Bai #language-Southern Balochi #language-North Babar #language-Bamenyam #language-Bamu #language-Baga Pokur #language-Bariai #language-Baoulé #language-Bardi #language-Bunuba #language-Central Bikol #language-Bannoni #language-Bali (Nigeria) #language-Kaluli #language-Bali (Democratic Republic of Congo) #language-Bench #language-Babine #language-Kohumono #language-Awad Bing #language-Shoo-Minda-Nye #language-Bana #language-Bacama #language-Bainouk-Gunyaamolo #language-Bayot #language-Basap #language-Emberá-Baudó #language-Bunama #language-Bade #language-Bonggi #language-Baka (South Sudan) #language-Burun #language-Bai (South Sudan) #language-Budukh #language-Indonesian Bajau #language-Buduma #language-Baldemu #language-Bende #language-Bahnar #language-West Coast Bajau #language-Burunge #language-Oroko #language-Baham #language-Budong-Budong #language-Bandjalang #language-Beaver #language-Bebele #language-Iceve-Maci #language-Bedoanas #language-Byangsi #language-Benabena #language-Belait #language-Biali #language-Bekati' #language-Beja #language-Bebeli #language-Belarusian #language-Bemba (Zambia) #language-Bengali #language-Beami #language-Besoa #language-Beembe #language-Guiberoua Béte #language-Blagar #language-Daloa Bété #language-Betawi #language-Jur Modo #language-Beli (Papua New Guinea) #language-Bena (Tanzania) #language-Bari #language-Panyi Bai #language-Bafut #language-Bofi #language-Busang Kayan #language-Blafe #language-Bafanji #language-Banda-Ndélé #language-Mmen #language-Bunak #language-Beba 
#language-Badaga #language-Southern Bai #language-Balti #language-Gahri #language-Bondo #language-Bagheli #language-Mahasu Pahari #language-Haryanvi #language-Rathwi Bareli #language-Bangandu #language-Bugun #language-Giangan #language-Bangolan #language-Bit #language-Bo (Laos) #language-Western Balochi #language-Baga Koga #language-Eastern Balochi #language-Bagri #language-Bawm Chin #language-Tagabawa #language-Bughotu #language-Mbongno #language-Warkay-Bipim #language-Banggai #language-Bhili #language-Biga #language-Bhadrawahi #language-Odiai #language-Binandere #language-Bukharic #language-Bahing #language-Bimin #language-Bohtan Neo-Aramaic #language-Bhojpuri #language-Bima #language-Tukang Besi South #language-Bara Malagasy #language-Bhattiyali #language-Biak #language-Bhele #language-Bada (Indonesia) #language-Badimaya #language-Bissa #language-Bidiyo #language-Bepour #language-Biafada #language-Biangai #language-Bile #language-Bimoba #language-Bini #language-Nai #language-Bipi #language-Bisorio #language-Bislama #language-Berinomo #language-Biete #language-Southern Birifor #language-Kol (Cameroon) #language-Birhor #language-Baloi #language-Budza #language-Banggarla #language-Bariji #language-Biao-Jiao Mien #language-Barzani Jewish Neo-Aramaic #language-Bidyogo #language-Bahinemo #language-Burji #language-Barok #language-Bulu (Papua New Guinea) #language-Bajelani #language-Banjar #language-Fanamaket #language-Binumarien #language-Bajan #language-Balanta-Ganja #language-Bedjond #language-Bakwé #language-Bayali #language-Baruga #language-Baka (Cameroon) #language-Binukid #language-Bakoko #language-Baki #language-Pande #language-Brokskat #language-Berik #language-Kom (Cameroon) #language-Bukitan #language-Kwa' #language-Bakairí #language-Bakumpai #language-Northern Sorsoganon #language-Buhid #language-Bekwarra #language-Bekwel #language-Baikeno #language-Bokyi #language-Bungku #language-Siksika #language-Bilua #language-Bella Coola #language-Bolango #language-Balanta-Kentohe #language-Buol #language-Kuwaa #language-Bolia #language-Bolongan #language-Pa'o Karen #language-Biloxi #language-Beli (South Sudan) #language-Southern Catanduanes Bikol #language-Blablanga #language-Baluan-Pam #language-Blang #language-Balaesang #language-Tai Dam #language-Kibala #language-Balangao #language-Mag-Indi Ayta #language-Notre #language-Balantak #language-Bembe #language-Biem #language-Baga Manduri #language-Limassa #language-Bom-Kim #language-Bamwe #language-Kein #language-Bagirmi #language-Ghayavi #language-Northern Betsimisaraka Malagasy #language-Bina (Papua New Guinea) #language-Bambalang #language-Muinane #language-Biao Mon #language-Somba-Siawari #language-Bum #language-Baimak #language-Baramu #language-Bonerate #language-Bookan #language-Bontok #language-Banda (Indonesia) #language-Bintauna #language-Masiwang #language-Benga #language-Bangi #language-Eastern Tawbuid #language-Bierebo #language-Boon #language-Batanga #language-Bunun #language-Bantoanon #language-Bola #language-Bantik #language-Butmas-Tur #language-Bundeli #language-Bonerif #language-Bisis #language-Bangubangu #language-Bintulu #language-Beezen #language-Bora #language-Aweer #language-Tibetan #language-Mundabli #language-Bolon #language-Boma #language-Barbareño #language-Anjam #language-Bole #language-Berom #language-Bine #language-Bonkiman #language-Bogaya #language-Borôro #language-Bosnian #language-Bongo #language-Bondei #language-Tuwuli #language-Buamu #language-Bodo (Central African Republic) #language-Tiéyaxo Bozo 
#language-Daakaka #language-Bonggo #language-Botlikh #language-Bagupi #language-Binji #language-Orowe #language-Biyom #language-Dzao Min #language-Kaure #language-Banda Malay #language-Koronadal Blaan #language-Sarangani Blaan #language-Bongu #language-Bian Marind #language-Bo (Papua New Guinea) #language-Bishnupriya #language-Bilba #language-Boko (Benin) #language-Bago-Kusuntu #language-Bakhtiari #language-Bilakura #language-Busa #language-Biritai #language-Burusu #language-Bosngun #language-Bamukumbit #language-Koro Wachi #language-Buru (Nigeria) #language-Bakaka #language-Brao #language-Berbice Creole Dutch #language-Baraamu #language-Breton #language-Bira #language-Baure #language-Brahui #language-Mokpwe #language-Bieria #language-Birked #language-Birwa #language-Barambu #language-Boruca #language-Barapasi #language-Breri #language-Birao #language-Bitare #language-Eastern Bru #language-Western Bru #language-Bodo (India) #language-Bilbil #language-Brunei Bisaya #language-Bassari #language-Wushi #language-Bauchi #language-Bashkardi #language-Kati #language-Bassossi #language-Burushaski #language-Basa-Gumna #language-Busami #language-Barasana-Eduria #language-Baga Sitemu #language-Bassa #language-Akoose #language-Basketo #language-Bahonsuai #language-Baiso #language-Yangkam #language-Sabah Bisaya #language-Bata #language-Bati (Cameroon) #language-Batak Dairi #language-Gamo-Ningi #language-Birgit #language-Biatah Bidayuh #language-Bacanese Malay #language-Batak Mandailing #language-Ratagnon #language-Rinconada Bikol #language-Budibud #language-Batek #language-Baetora #language-Batak Simalungun #language-Bete-Bendi #language-Batu #language-Bateri #language-Butuanon #language-Batak Karo #language-Bobot #language-Batak Alas-Kluet #language-Buriat #language-Bua #language-Bushi #language-Ntcham #language-Beothuk #language-Bushoong #language-Buginese #language-Younuo Bunu #language-Bugawac #language-Bulgarian #language-Bulu (Cameroon) #language-Sherbro #language-Terei #language-Busoa #language-Brem #language-Bokobaru #language-Bungain #language-Bubi #language-Boghom #language-Bullom So #language-Barein #language-Bube #language-Baelelea #language-Baeggu #language-Bonkeng #language-Belanda Viri #language-Baan #language-Bukat #language-Bamunka #language-Buna #language-Bolgo #language-Birri #language-Burarra #language-Baniva #language-Boga #language-Dibole #language-Bauzi #language-Bwatoo #language-Namosi-Naitasiri-Serua #language-Bwaidoka #language-Bwe Karen #language-Boselewa #language-Barwe #language-Baniwa #language-Bauwaki #language-Bwela #language-Biwat #language-Wunai Bunu #language-Boro (Ethiopia) #language-Mandobo Bawah #language-Southern Bobo Madaré #language-Bura-Pabir #language-Bomboma #language-Bafaw-Balong #language-Buli (Ghana) #language-Bwa #language-Bu-Nao Bunu #language-Tairaha #language-Belanda Bor #language-Pela #language-Birale #language-Bilur #language-Bangala #language-Buhutu #language-Pirlatapa #language-Bayungu #language-Bukusu #language-Jalkunan #language-Mongolia Buriat #language-Burduna #language-Beele #language-Russia Buriat #language-China Buriat #language-Berakou #language-Bankagooma #language-Binahari #language-Batak #language-Benyadu' #language-Pouye #language-Bete #language-Buyu #language-Biao #language-Bayono #language-Bidjara #language-Bilin #language-Biyo #language-Bumaji #language-Basay #language-Baruya #language-Burak #language-Berti #language-Medumba #language-Belhariya #language-Qaqet #language-Banaro #language-Bandi #language-Andio #language-Southern 
Betsimisaraka Malagasy #language-Bribri #language-Jenaama Bozo #language-Boikin #language-Babuza #language-Mapos Buang #language-Bisu #language-Belize Kriol English #language-Nicaragua Creole English #language-Boano (Sulawesi) #language-Boano (Maluku) #language-Kemberano #language-Buli (Indonesia) #language-Biri #language-Brithenig #language-Burmeso #language-Naami #language-Basa (Nigeria) #language-Kɛlɛngaxo Bozo #language-Obanliku #language-Evant #language-Chortí #language-Garifuna #language-Chuj #language-Caddo #language-Lehar #language-Southern Carrier #language-Nivaclé #language-Cahuarano #language-Chané #language-Kaqchikel #language-Carolinian #language-Cemuhî #language-Chambri #language-Chácobo #language-Chipaya #language-Car Nicobarese #language-Galibi Carib #language-Tsimané #language-Catalan #language-Cavineña #language-Callawalla #language-Chiquitano #language-Cayuga #language-Canichana #language-Cabiyarí #language-Carapana #language-Carijona #language-Chimila #language-Chachi #language-Ede Cabe #language-Chavacano #language-Nyahkur #language-Izora #language-Cashibo-Cacataibo #language-Cashinahua #language-Chayahuita #language-Candoshi-Shapra #language-Cacua #language-Carabayo #language-Chamicuro #language-Chopi #language-Samba Daka #language-Atsam #language-Kasanga #language-Comaltepec Chinantec #language-Chakma #language-Cacaopera #language-Chiru #language-Chambeali #language-Churahi #language-Chepang #language-Chaudangsi #language-Min Dong Chinese #language-Cinda-Regi-Tiyal #language-Chadong #language-Koda #language-Lower Chehalis #language-Cebuano #language-Chamacoco #language-Czech #language-Centúúm #language-Dijim-Bwilim #language-Cara #language-Como Karim #language-Falam Chin #language-Kagayanen #language-Chiga #language-Chamorro #language-Chibcha #language-Catawba #language-Highland Oaxaca Chontal #language-Chechen #language-Tabasco Chontal #language-Chagatai #language-Chinook #language-Ojitlán Chinantec #language-Chuukese #language-Cahuilla #language-Mari (Russia) #language-Chinook jargon #language-Choctaw #language-Chipewyan #language-Quiotepec Chinantec #language-Cherokee #language-Cholón #language-Church Slavic #language-Chuvash #language-Chuwabu #language-Chantyal #language-Cheyenne #language-Ozumacín Chinantec #language-Cia-Cia #language-Chickasaw #language-Chimariko #language-Cineni #language-Chitkuli Kinnauri #language-Cimbrian #language-Cinta Larga #language-Chiapanec #language-Tiri #language-Chippewa #language-Chaima #language-Western Cham #language-Chru #language-Upper Chehalis #language-Chamalal #language-Chokwe #language-Eastern Cham #language-Chenapian #language-Ashéninka Pajonal #language-Cabécar #language-Shor #language-Chuave #language-Jinyu Chinese #language-Central Kurdish #language-Chak #language-Cibak #language-Anufo #language-Kajakse #language-Tayo #language-Chukot #language-Koasati #language-Kavalan #language-Caka #language-Ron #language-Chilcotin #language-Chaldean Neo-Aramaic #language-Lealao Chinantec #language-Chilisso #language-Chakali #language-Idu-Mishmi #language-Chala #language-Clallam #language-Lowland Oaxaca Chontal #language-Chulym #language-Eastern Highland Chatino #language-Cerma #language-Classical Mongolian #language-Emberá-Chamí #language-Campalagian #language-Mandarin Chinese #language-Central Mnong #language-Chinbon Chin #language-Northern Qiang #language-Hakha Chin #language-Asháninka #language-Khumi Chin #language-Lalana Chinantec #language-Northern Ping Chinese #language-Montenegrin #language-Central Asmat 
#language-Tepetotutla Chinantec #language-Chenoua #language-Middle Cornish #language-Chicomuceltec #language-Cocopa #language-Cocama-Cocamilla #language-Koreguaje #language-Colorado #language-Chong #language-Chonyi-Dzihana-Kauma #language-Cochimi #language-Santa Teresa Cora #language-Columbia-Wenatchi #language-Comanche #language-Cofán #language-Comox #language-Coptic #language-Coquille #language-Cornish #language-Corsican #language-Caquinte #language-Wamey #language-Cowlitz #language-Chochotec #language-Palantla Chinantec #language-Ucayali-Yurúa Ashéninka #language-Ajyíninka Apurucayali #language-Cappadocian Greek #language-Chinese Pidgin English #language-Cherepon #language-Capiznon #language-Pichis Ashéninka #language-Pu-Xian Chinese #language-Chuanqiandian Cluster Miao #language-Chara #language-Island Carib #language-Lonwolwol #language-Coeur d'Alene #language-Cree #language-Michif #language-Crimean Tatar #language-Sãotomense #language-Southern East Cree #language-Plains Cree #language-Northern East Cree #language-Moose Cree #language-El Nayar Cora #language-Crow #language-Iyo'wujwa Chorote #language-Carolina Algonquian #language-Seselwa Creole French #language-Iyojwa'ja Chorote #language-Chrau #language-Carrier #language-Cori #language-Cruzeño #language-Chiltepec Chinantec #language-Kashubian #language-Asho Chin #language-Coast Miwok #language-Central Sierra Miwok #language-Sochiapam Chinantec #language-Southern Ping Chinese #language-Southern Ohlone #language-Northern Ohlone #language-Swampy Cree #language-Siyin Chin #language-Coos #language-Tataltepec Chatino #language-Chetco #language-Tedim Chin #language-Tepinapa Chinantec #language-Chittagonian #language-Tlacoatzintepec Chinantec #language-Chitimacha #language-Chhintange #language-Emberá-Catío #language-Western Highland Chatino #language-Northern Catanduanes Bikol #language-Chol #language-Zacatepec Chatino #language-Cua #language-Cubeo #language-Usila Chinantec #language-Chuka #language-Cuiba #language-Mashco Piro #language-San Blas Kuna #language-Culina #language-Cumanagoto #language-Cupeño #language-Cun #language-Chhulung #language-Teutila Cuicatec #language-Tai Ya #language-Cuvok #language-Tepeuxila Cuicatec #language-Cuitlatec #language-Chug #language-Valle Nacional Chinantec #language-Woods Cree #language-Kwere #language-Chewong #language-Kuwaataay #language-Nopala Chatino #language-Cayubaba #language-Welsh #language-Cuyonon #language-Huizhou Chinese #language-Knaanic #language-Zenzontepec Chatino #language-Min Zhong Chinese #language-Zotung Chin #language-Dangaléat #language-Dambi #language-Marik #language-Dagbani #language-Gwahatike #language-Day #language-Dar Fur Daju #language-Dakota #language-Dahalo #language-Damakawa #language-Danish #language-Dandami Maria #language-Dargwa #language-Dar Sila Daju #language-Taita #language-Davawenyo #language-Dayi #language-Bangime #language-Deno #language-Dadiya #language-Dabe #language-Edopi #language-Dogul Dom Dogon #language-Doka #language-Ida'an #language-Dyirbal #language-Duguri #language-Duriankere #language-Duwai #language-Daba #language-Ben Tey Dogon #language-Bondum Dom Dogon #language-Bankan Tey Dogon #language-Dibiyaso #language-Deccan #language-Negerhollands #language-Dadi Dadi #language-Dongotono #language-Fataluku #language-West Goodenough #language-Jaru #language-Dendi (Benin) #language-Dido #language-Donno So Dogon #language-Dawera-Daweloor #language-Dagik #language-Dedua #language-Dewoin #language-Degema #language-Demisa #language-Delaware #language-Dem 
#language-Slave (Athapascan) #language-Pidgin Delaware #language-Deori #language-Desano #language-German #language-Domung #language-Southern Dagaare #language-Bunoge Dogon #language-Casiguran Dumagat Agta #language-Dagaari Dioula #language-Degenan #language-Doga #language-Dghwede #language-Northern Dagara #language-Andaandi #language-Dogri (individual language) #language-Dogrib #language-Ndra'ngith #language-Daungwurrung #language-Daga #language-Dhundari #language-Dhangu-Djangu #language-Dhimal #language-Dhalandji #language-Dhargari #language-Dhaiso #language-Dhurga #language-Dehu #language-Dia #language-South Central Dinka #language-Lakota Dida #language-Didinga #language-Dieri #language-Digo #language-Kumiai #language-Dimbong #language-Dai #language-Southwestern Dinka #language-Dilling #language-Dime #language-Dinka #language-Dibo #language-Northeastern Dinka #language-Dimli (individual language) #language-Dirim #language-Dimasa #language-Diriku #language-Dhivehi #language-Northwestern Dinka #language-Dixon Reef #language-Ding #language-Djadjawurrung #language-Djinba #language-Dar Daju Daju #language-Djamindjung #language-Zarma #language-Djinang #language-Djeebbana #language-Eastern Maroon Creole #language-Jamsay Dogon #language-Jawoyn #language-Djambarrpuyngu #language-Kapriman #language-Djawi #language-Kuijau #language-Southeastern Dinka #language-Dolgan #language-Dahalik #language-Dalmatian #language-Duma #language-Mombo Dogon #language-Gavak #language-Madhi Madhi #language-Dugwor #language-Medefaidrin #language-Upper Kinabatangan #language-Domaaki #language-Dameli #language-Dama #language-Kemedzung #language-East Damar #language-Dampelas #language-Dubu #language-Dumpas #language-Mudburra #language-Demta #language-Upper Grand Valley Dani #language-Daonda #language-Ndendeule #language-Dungan #language-Lower Grand Valley Dani #language-Dan #language-Dzùùngoo #language-Danaru #language-Mid Grand Valley Dani #language-Danau #language-Western Dani #language-Dení #language-Dom #language-Dobu #language-Northern Dong #language-Doe #language-Domu #language-Dong #language-Dogri (macrolanguage) #language-Dondo #language-Doso #language-Toura (Papua New Guinea) #language-Dongo #language-Dori'o #language-Dogosé #language-Dass #language-Doyayo #language-Bussa #language-Dompo #language-Dorze #language-Papar #language-Dair #language-Darmiya #language-Dolpo #language-Rungus #language-C'Lela #language-Paakantyi #language-West Damar #language-Daro-Matu Melanau #language-Gedeo #language-Rukai #language-Darai #language-Lower Sorbian #language-Daasanach #language-Dusner #language-Desiya #language-Tadaksahak #language-Daur #language-Labuk-Kinabatangan Kadazan #language-Ditidaht #language-Adithinngithigh #language-Ana Tinga Dogon #language-Tene Kan Dogon #language-Tomo Kan Dogon #language-Tommo So Dogon #language-Kadazan Dusun #language-Lotud #language-Toro So Dogon #language-Toro Tegu Dogon #language-Tebul Ure Dogon #language-Dotyali #language-Duala #language-Duna #language-Umiray Dumaget Agta #language-Dumbea #language-Duruma #language-Uyajitaya #language-Middle Dutch (ca. 
1050-1350) #language-Dusun Deyah #language-Dupaninan Agta #language-Duano #language-Dusun Malang #language-Dii #language-Dumi #language-Drung #language-Duvle #language-Dusun Witu #language-Duungooma #language-Duau #language-Diri #language-Dawik Kui #language-Dawro #language-Dutton World Speedwords #language-Dhuwal #language-Dawawa #language-Dyan #language-Dyaberdyaber #language-Dyugun #language-Djimini Senoufo #language-Yanda Dom Dogon #language-Dyangadi #language-Jola-Fonyi #language-Dyula #language-Djabugay #language-Djiwarli #language-Dazaga #language-Dzongkha #language-Ebughu #language-Eastern Bontok #language-Teke-Ebo #language-Ebrié #language-Embu #language-Eteocypriot #language-E #language-Efai #language-Efik #language-Ega #language-Emilian #language-Eggon #language-Egyptian (Ancient) #language-Ehueun #language-Eipomek #language-Eitiep #language-Askopan #language-Ejamat #language-Ekajuk #language-Ekit #language-Ekari #language-Standard Estonian #language-Kol (Bangladesh) #language-Elip #language-Koti #language-Ekpeye #language-Yace #language-Eastern Kayah #language-Elepi #language-Nding #language-Elkei #language-Modern Greek (1453-) #language-Eleme #language-El Molo #language-Elamite #language-Emai-Iuleha-Ora #language-Embaloh #language-Emerillon #language-Eastern Meohang #language-Mussau-Emira #language-Eastern Maninkakan #language-Eman #language-Northern Emberá #language-Pacific Gulf Yupik #language-Emplawas #language-Epigraphic Mayan #language-Apali #language-Markweeta #language-En #language-Ende #language-Forest Enets #language-English #language-Tundra Enets #language-Enlhet #language-Middle English (1100-1500) #language-Engenni #language-Enggano #language-Enga #language-Emumu #language-Enwan (Akwa Ibom State) #language-Enxet #language-Beti (Côte d'Ivoire) #language-Epie #language-Esperanto #language-Sie #language-Eruwa #language-Ogea #language-South Efate #language-Horpa #language-Erre #language-Ersu #language-Eritai #language-Erokwanas #language-Ese Ejja #language-Aheri Gondi #language-Eshtehardi #language-North Alaskan Inupiatun #language-Northwest Alaska Inupiatun #language-Estonian Sign Language #language-Esselen #language-Central Siberian Yupik #language-Central Yupik #language-Etebi #language-Etchemin #language-Eton (Vanuatu) #language-Eton (Cameroon) #language-Edolo #language-Yekhee #language-Etruscan #language-Ejagham #language-Eten #language-Semimi #language-Basque #language-Even #language-Uvbie #language-Evenki #language-Ewe #language-Ewondo #language-Extremaduran #language-Eyak #language-Keiyo #language-Fasu #language-Fa d'Ambu #language-Wagi #language-Fagani #language-Finongan #language-Faiwol #language-Faita #language-Fang (Cameroon) #language-South Fali #language-Fam #language-Fang (Equatorial Guinea) #language-Faroese #language-Paloor #language-Fataleka #language-Fanti #language-Fayu #language-Fala #language-Southwestern Fars #language-West Albay Bikol #language-Feroge #language-Maasina Fulfulde #language-Fongoro #language-Nobiin #language-Fyer #language-Fijian #language-Filipino #language-Finnish #language-Fipa #language-Firan #language-Tornedalen Finnish #language-Kirya-Konzəl #language-Kven Finnish #language-Kalispel-Pend d'Oreille #language-Foau #language-Fali #language-North Fali #language-Flinders Island #language-Fuliiru #language-Flaaitaal #language-Fe'fe' #language-Fanagalo #language-Fania #language-Foodo #language-Foi #language-Foma #language-Fon #language-Fore #language-Siraya #language-Fernando Po Creole English #language-Fas #language-French 
#language-Cajun French #language-Fordata #language-Frankish #language-Middle French (ca. 1400-1600) #language-Old French (842-ca. 1400) #language-Arpitan #language-Forak #language-Northern Frisian #language-Fortsenal #language-Western Frisian #language-Adamawa Fulfulde #language-Pulaar #language-East Futuna #language-Pular #language-Western Niger Fulfulde #language-Ko #language-Fulah #language-Fulniô #language-Friulian #language-Futuna-Aniwa #language-Furu #language-Nigerian Fulfulde #language-Fuyug #language-Fur #language-Fwâi #language-Ga #language-Gabri #language-Mixed Great Andamanese #language-Gaddang #language-Guarequena #language-Gende #language-Gagauz #language-Alekano #language-Borei #language-Gadsup #language-Galolen #language-Kandawo #language-Gan Chinese #language-Gants #language-Gal #language-Gata' #language-Galeya #language-Kenati #language-Nobonob #language-Borana-Arsi-Guji Oromo #language-Gayo #language-West Central Oromo #language-Gbaya (Central African Republic) #language-Kaytetye #language-Karajarri #language-Niksek #language-Gbanziri #language-Galela #language-Bodo Gadaba #language-Garhwali #language-Gbaya-Bossangoa #language-Gbagyi #language-Gagadu #language-Gabi-Gabi #language-Eastern Xwla Gbe #language-Gbari #language-Zoroastrian Dari #language-Mali #language-Ganggalida #language-Galice #language-Guadeloupean Creole French #language-Grenadian Creole English #language-Gaina #language-Guianese Creole French #language-Pottangi Ollar Gadaba #language-Gugu Badhun #language-Gedaged #language-Gude #language-Guduf-Gava #language-Ga'dang #language-Gadjerawang #language-Gundi #language-Gurdjar #language-Dirasha #language-Laal #language-Umanakaina #language-Ghodoberi #language-Mehri #language-Wipi #language-Gudu #language-Godwari #language-Geruma #language-Kire #language-Gade #language-Gengle #language-Hutterite German #language-Gebe #language-Gen #language-Ywom #language-ut-Ma'in #language-Geme #language-Geser-Gorom #language-Eviya #language-Gera #language-Enya #language-Geez #language-Patpatar #language-Gafat #language-Gao #language-Gbii #language-Gugadj #language-Gurr-goni #language-Kungarakany #language-Ganglau #language-Gitua #language-Gagu #language-Gogodala #language-Ghadamès #language-Geko Karen #language-Ghulfan #language-Ghanongga #language-Ghomara #language-Guhu-Samane #language-Kuke #language-Kija #language-Gail #language-Gidar #language-Gilbertese #language-Gimi (Eastern Highlands) #language-Hinukh #language-Gimi (West New Britain) #language-Green Gelao #language-Red Gelao #language-North Giziga #language-Gitxsan #language-White Gelao #language-Gilima #language-Giyug #language-South Giziga #language-Gunditjmara #language-Gonja #language-Gujari #language-Ndai #language-Gokana #language-Guinea Kpelle #language-ǂUngkue #language-Scottish Gaelic #language-Bon Gula #language-Nanai #language-Irish #language-Galician #language-Northwest Pashai #language-Gula Iro #language-Gilaki #language-Galambu #language-Manx #language-Glavda #language-Gule #language-Gambera #language-Gula'alaa #language-Middle High German (ca. 
1050-1500) #language-Middle Low German #language-Gbaya-Mbodomo #language-Gumalu #language-Gamo #language-Mycenaean Greek #language-Kaansa #language-Guanche #language-Zulgo-Gemzek #language-Ganang #language-Ngangam #language-Lere #language-Gooniyandi #language-ǁGana #language-Gangulu #language-Ginuman #language-Gumatj #language-Northern Gondi #language-Gana #language-Gureng Gureng #language-Gnau #language-Guro #language-Playero #language-Gorakor #language-Godié #language-Gofa #language-Gogo #language-Old High German (ca. 750-1050) #language-Gobasi #language-Gola #language-Goan Konkani #language-Gondi #language-Yeretuar #language-Gorontalo #language-Gronings #language-Gothic #language-Gavar #language-Gorowa #language-Gobu #language-Gupa-Abawa #language-Ghanaian Pidgin English #language-Taiap #language-Ga'anda #language-Guiqiong #language-Guana (Brazil) #language-Gor #language-Qau #language-Rajput Garasia #language-Grebo #language-Ancient Greek (to 1453) #language-Guruntum-Mbaaru #language-Madi #language-Gbiri-Niragu #language-Ghari #language-Guarani #language-Groma #language-Gorovu #language-Taznatit #language-Gresi #language-Garo #language-Kistane #language-Guriaso #language-Gusilay #language-Nema #language-Southwest Gbaya #language-Wasembo #language-Swiss German #language-Guató #language-Aghu-Tharnggala #language-Shiki #language-Guajajára #language-Wayuu #language-Yocoboué Dida #language-Gurindji #language-Gupapuyngu #language-Paraguayan Guaraní #language-Guahibo #language-Eastern Bolivian Guaraní #language-Gujarati #language-Gumuz #language-Sea Island Creole English #language-Guambiano #language-Mbyá Guaraní #language-Guayabero #language-Gunwinggu #language-Aché #language-Farefare #language-Maléku Jaíka #language-Yanomamö #language-Gourmanchéma #language-Gusii #language-Guana (Paraguay) #language-Guanano #language-Duwet #language-Golin #language-Guajá #language-Gulay #language-Gurmana #language-Kuku-Yalanji #language-Gavião Do Jiparaná #language-Pará Gavião #language-Gurung #language-Gumawana #language-Mbato #language-Gawri #language-Gawwada #language-Gweno #language-Gowro #language-Gwichʼin #language-ǀGwi #language-Gwandara #language-Gwere #language-Gawar-Bati #language-Guwamu #language-Kwini #language-Gua #language-Wè Southern #language-Northwest Gbaya #language-Garus #language-Kayardild #language-Gyem #language-Gayil #language-Ngäbere #language-Guyanese Creole English #language-Gyalsumdo #language-Guarayu #language-Gunya #language-Ganza #language-Gazi #language-Gane #language-Han #language-Gurani #language-Hatam #language-Eastern Oromo #language-Hanga #language-Hahon #language-Haida #language-Hajong #language-Hakka Chinese #language-Halang #language-Hewa #language-Hangaza #language-Hakö #language-Ha #language-Harari #language-Haisla #language-Haitian #language-Hausa #language-Havu #language-Hawaiian #language-Southern Haida #language-Haya #language-Hazaragi #language-Huba #language-Heiban #language-Ancient Hebrew #language-Serbo-Croatian #language-Habu #language-Andaman Creole Hindi #language-Huichol #language-Northern Haida #language-Hadiyya #language-Northern Qiandong Miao #language-Hebrew #language-Herdé #language-Helong #language-Hehe #language-Heiltsuk #language-Hemba #language-Herero #language-Haiǁom #language-Lamang #language-Hibito #language-Hidatsa #language-Fiji Hindi #language-Kamwe #language-Pamosu #language-Seit-Kaitetu #language-Hiligaynon #language-Hindi #language-Tsoa #language-Hittite #language-Hiw #language-Hixkaryána #language-Kahe #language-Hunde 
#language-Hunjara-Kaina Ke #language-Halia #language-Halbi #language-Hieroglyphic Luwian #language-Humburi Senni Songhay #language-Large Flowery Miao #language-Luopohe Hmong #language-Hmong #language-Hiri Motu #language-Eastern Qiandong Miao #language-Hmar #language-Southern Qiandong Miao #language-Hamtai #language-Hamap #language-Hmong Dô #language-Mina (Cameroon) #language-Southern Hindko #language-Chhattisgarhi #language-Hungu #language-ǁAni #language-Hani #language-Hmong Njua #language-Hanunoo #language-Northern Hindko #language-Caribbean Hindustani #language-Hung #language-Hoava #language-Mari (Madang Province) #language-Ho #language-Holma #language-Horom #language-Hobyót #language-Holikachuk #language-Hadothi #language-Holu #language-Holoholo #language-Hopi #language-Hote #language-Hovongan #language-Honi #language-Hozo #language-Hpon #language-Hrangkhol #language-Niwer Mil #language-Hre #language-Haruku #language-Haroi #language-Hértevin #language-Hruso #language-Croatian #language-Warwar Feni #language-Hunsrik #language-Harzani #language-Upper Sorbian #language-Xiang Chinese #language-Harsusi #language-Hoti #language-Minica Huitoto #language-Hadza #language-Hitu #language-Middle Hittite #language-Huambisa #language-ǂHua #language-Huaulu #language-San Francisco Del Mar Huave #language-Humene #language-Huachipaeri #language-Huilliche #language-Huli #language-Hulung #language-Hula #language-Hungana #language-Hungarian #language-Hu #language-Hupa #language-Tsat #language-Halkomelem #language-Huastec #language-Humla #language-Murui Huitoto #language-San Mateo Del Mar Huave #language-Hukumina #language-Nüpode Huitoto #language-Hunzib #language-Haitian Vodoun Culture Language #language-San Dionisio Del Mar Huave #language-Sabu #language-Santa María Del Mar Huave #language-Wané #language-Hawai'i Creole English #language-Hwana #language-Hya #language-Armenian #language-Western Armenian #language-Iaai #language-Iatmul #language-Purari #language-Iban #language-Ibibio #language-Iwaidja #language-Akpes #language-Ibanag #language-Ibaloi #language-Ibino #language-Igbo #language-Ibuoro #language-Ibu #language-Ibani #language-Ede Ica #language-Etkywan #language-Idakho-Isukha-Tiriki #language-Indo-Portuguese #language-Idon #language-Ede Idaca #language-Idi #language-Ido #language-Indri #language-Idaté #language-Idoma #language-Amganad Ifugao #language-Batad Ifugao #language-Ifè #language-Ifo #language-Tuwali Ifugao #language-Teke-Fuumu #language-Mayoyao Ifugao #language-Keley-I Kallahan #language-Ebira #language-Igede #language-Igala #language-Kanggape #language-Ignaciano #language-Isebe #language-Interglossa #language-Iha #language-Sichuan Yi #language-Thiin #language-Izon #language-Biseni #language-Ede Ije #language-Kalabari #language-Southeast Ijo #language-Eastern Canadian Inuktitut #language-Iko #language-Ikulu #language-Olulumo-Ikom #language-Inuinnaqtun #language-Inuktitut #language-Ikwere #language-Ik #language-Ikizu #language-Ila #language-Interlingue #language-Ilongot #language-Iloko #language-Iranun (Philippines) #language-Ili'uun #language-Ilue #language-Anamgura #language-Miluk #language-Imonda #language-Imbongu #language-Imroing #language-Marsian #language-Interlingua (International Auxiliary Language Association) #language-Inga #language-Indonesian #language-Degexit'an #language-Ingush #language-Jungle Inga #language-Minaean #language-Isinai #language-Inoke-Yate #language-Iñapari #language-Intha #language-Ineseño #language-Inor #language-Tuma-Irumu #language-Iowa-Oto #language-Ipili 
#language-Inupiaq #language-Ipiko #language-Iquito #language-Iresim #language-Irarutu #language-Rigwe #language-Iraqw #language-Irántxe #language-Irula #language-Kamberau #language-Iraya #language-Isabi #language-Isconahua #language-Isnag #language-Nkem-Nkum #language-Ishkashimi #language-Icelandic #language-Isoko #language-Istriot #language-Isu (Menchum Division) #language-Italian #language-Binongan Itneg #language-Southern Tidung #language-Itene #language-Inlaod Itneg #language-Judeo-Italian #language-Itelmen #language-Itu Mbon Uzo #language-Itonama #language-Iteri #language-Itawit #language-Ito #language-Itik #language-Itzá #language-Iu Mien #language-Ibatan #language-Ivatan #language-Iwam #language-Sepik Iwam #language-Ixcatec #language-Ixil #language-Mesaka #language-Yaka (Congo) #language-Ingrian #language-Izere #language-Izii #language-Jamamadí #language-Hyam #language-Popti' #language-Yabem #language-Jara #language-Jah Hut #language-Zazao #language-Jakun #language-Yalahatan #language-Jamaican Creole English #language-Yanyuwa #language-Yaqay #language-Yaur #language-Javanese #language-Jambi Malay #language-Yan-nhangu #language-Jawe #language-Judeo-Berber #language-Badjiri #language-Arandai #language-Barikewa #language-Nafusi #language-Lojban #language-Jabutí #language-Jukun Takum #language-Krymchak #language-Judeo-Tat #language-Jebero #language-Jerung #language-Jeh #language-Yei #language-Jeri Kuo #language-Yelmek #language-Dza #language-Jere #language-Manem #language-Jonkor Bourmataguil #language-Judeo-Georgian #language-Gwak #language-Ngomba #language-Jehai #language-Jina #language-Jibu #language-Tol #language-Bu (Kaduna State) #language-Jingulu #language-Jiiddu #language-Jilim #language-Jimi (Cameroon) #language-Jiamao #language-Jita #language-Youle Jinuo #language-Shuar #language-Bankal #language-Kaera #language-Mobwa Karen #language-Kubo #language-Paku Karen #language-Labir #language-Ngile #language-Dima #language-Zumbun #language-Machame #language-Yamdena #language-Jimi (Nigeria) #language-Jumli #language-Kamara #language-Mashi (Nigeria) #language-Western Juxtlahuaca Mixtec #language-Jangshung #language-Yangman #language-Janji #language-Yemsa #language-Jaunsari #language-Joba #language-Wojenaka #language-Jorá #language-Jowulu #language-Jewish Palestinian Aramaic #language-Japanese #language-Jaqaru #language-Jarai #language-Judeo-Arabic #language-Jiru #language-Japrería #language-Júma #language-Wannu #language-Jurchen #language-Worodougou #language-Hõne #language-Wapan #language-Jirel #language-Jumjum #language-Juang #language-Jiba #language-Hupdë #language-Jurúna #language-Jutish #language-Ju #language-Wãpha #language-Juray #language-Caribbean Javanese #language-Jiarong #language-Kara-Kalpak #language-Kabyle #language-Kachin #language-Adara #language-Ketangalan #language-Kajaman #language-Kara (Central African Republic) #language-Karekare #language-Jju #language-Kalanguya #language-Kalaallisut #language-Kamba (Kenya) #language-Kannada #language-Xaasongaxango #language-Bezhta #language-Capanahua #language-Kashmiri #language-Georgian #language-Kanuri #language-Katukína #language-Kawi #language-Kamayurá #language-Kazakh #language-Kaxuiâna #language-Kadiwéu #language-Kabardian #language-Kanju #language-Camsá #language-Kaptiau #language-Kari #language-Grass Koiari #language-Kanembu #language-Iwal #language-Kare (Central African Republic) #language-Keliko #language-Kabiyè #language-Kamano #language-Kafa #language-Kande #language-Abadi #language-Dera (Indonesia) #language-Kaiep 
#language-Ap Ma #language-Manga Kanuri #language-Duhwa #language-Khanty #language-Kawacha #language-Ngkâlmpw Kanum #language-Ukaan #language-Tyap #language-Kamantan #language-Kobiana #language-Kalanga #language-Kela (Papua New Guinea) #language-Gula (Central African Republic) #language-Nubi #language-Kinalakna #language-Kanga #language-Katla #language-Kaian #language-Kami (Tanzania) #language-Kete #language-Kabwari #language-Kachama-Ganjule #language-Korandje #language-Worimi #language-Kutu #language-Yankunytjatjara #language-Makonde #language-Mamusi #language-Tem #language-Kumam #language-Karamojong #language-Numèè #language-Tsikimba #language-Kagoma #language-Kunda #language-Kaningdon-Nindem #language-Koch #language-Karaim #language-Kuy #language-Kadaru #language-Koneraw #language-Kam #language-Kwaja #language-Kabuverdianu #language-Kélé #language-Keiga #language-Kerewe #language-Eastern Keres #language-Kpessi #language-Tese #language-Kei #language-Kekchí #language-Kela (Democratic Republic of Congo) #language-Kemak #language-Kenyang #language-Kakwa #language-Kera #language-Kugbo #language-Ket #language-Akebu #language-West Kewa #language-Kukna #language-Kukele #language-Kodava #language-Northwestern Kolami #language-Konda-Dora #language-Korra Koraga #language-Kota (India) #language-Koya #language-Kannada Kurumba #language-Kinnauri #language-Khunsari #language-Kuk #language-Koro (Côte d'Ivoire) #language-Korwa #language-Korku #language-Kachhi #language-Kumaoni #language-Koromfé #language-Koyaga #language-Kawe #language-Komering #language-Kube #language-Kusunda #language-Gamale Kham #language-Kaiwá #language-Kunggari #language-Krongo #language-Kaingang #language-Kamoro #language-Abun #language-Kumbainggar #language-Somyev #language-Kobol #language-Karas #language-Karon Dori #language-Kamaru #language-Kyerung #language-Khasi #language-Lü #language-Tukang Besi North #language-Korowai #language-Khuen #language-Khams Tibetan #language-Kehu #language-Kuturmi #language-Halh Mongolian #language-Lusi #language-Khmer #language-Khandesi #language-Khotanese #language-Kapori #language-Koyra Chiini Songhay #language-Kharia #language-Kasua #language-Khamti #language-Nkhumbi #language-Khvarshi #language-Khowar #language-Kele (Democratic Republic of Congo) #language-Keapara #language-Kim #language-Koalib #language-Kickapoo #language-Koshin #language-Kibet #language-Kimaama #language-Kilmeri #language-Kitsai #language-Kilivila #language-Kikuyu #language-Kariya #language-Karagas #language-Kinyarwanda #language-Kiowa #language-Kosadle #language-Kirghiz #language-Kis #language-Agob #language-Kirmanjki (individual language) #language-Kimbu #language-Northeast Kiwai #language-Kirikiri #language-Kisi #language-Mlap #language-Q'anjob'al #language-Coastal Konjo #language-Southern Kiwai #language-Kisar #language-Khmu #language-Khakas #language-Zabana #language-Khinalugh #language-Highland Konjo #language-Western Parbate Kham #language-Kháng #language-Kunjen #language-Pwo Eastern Karen #language-Western Keres #language-Kurudu #language-East Kewa #language-Kashaya #language-Ramopa #language-Erave #language-Bumthangkha #language-Kakanda #language-Kwerisa #language-Odoodee #language-Kakabe #language-Khün #language-Kagulu #language-Kako #language-Kokota #language-Kosarek Yale #language-Kiong #language-Karko #language-Gugubera #language-Giiwo #language-Teke-Kukuya #language-Guugu Yimidhirr #language-Kaska #language-Klamath-Modoc #language-Kiliwa #language-Kolbila #language-Gamilaraay #language-Kulung (Nepal) 
#language-Tagakaulo #language-Weliki #language-Kalumpang #language-Khalaj #language-Kagan Kalagan #language-Migum #language-Kalenjin #language-Kamasa #language-Rumu #language-Khaling #language-Kalasha #language-Nukna #language-Klao #language-Maskelynes #language-Tado #language-Koluwawa #language-Kalao #language-Kabola #language-Konni #language-Kimbundu #language-Southern Dong #language-Majukayang Kalinga #language-Kare (Papua New Guinea) #language-Kâte #language-Kalam #language-Kami (Nigeria) #language-Limos Kalinga #language-Tanudan Kalinga #language-Kom (India) #language-Awtuw #language-Kwoma #language-Kwama #language-Northern Kurdish #language-Kamasau #language-Kemtuik #language-Kanite #language-Karipúna Creole French #language-Komo (Democratic Republic of Congo) #language-Waboda #language-Khorasani Turkish #language-Dera (Nigeria) #language-Lubuagan Kalinga #language-Central Kanuri #language-Konda #language-Kankanaey #language-Mankanya #language-Koongo #language-Kanufi #language-Western Kanjobal #language-Kuranko #language-Kanamarí #language-Konkani (individual language) #language-Kono (Sierra Leone) #language-Kwanja #language-Kintaq #language-Kaningra #language-Kensiu #language-Panoan Katukína #language-Tabo #language-Kung-Ekoka #language-Kendayan #language-Kanyok #language-Kodi #language-Kacipo-Bale Suri #language-Kubi #language-Cogui #language-Koyo #language-Komi-Permyak #language-Konkani (macrolanguage) #language-Kol (Papua New Guinea) #language-Kongo #language-Konzo #language-Waube #language-Kota (Gabon) #language-Korean #language-Kosraean #language-Lagwan #language-Koke #language-Kudu-Camo #language-Koyukon #language-Korak #language-Curripaco #language-Kpelle #language-Komba #language-Kapingamarangi #language-Kplang #language-Karajá #language-Kpan #language-Kpala #language-Koho #language-Kepkiriwát #language-Ikposo #language-Korupun-Sela #language-Korafe-Yegha #language-Tehit #language-Karata #language-Kafoa #language-Komi-Zyrian #language-Kobon #language-Mountain Koiali #language-Koryak #language-Kupsabiny #language-Mum #language-Kovai #language-Doromu-Koki #language-Kalagan #language-Kakabai #language-Koitabu #language-Koromira #language-Kyenele #language-Kaonde #language-Eastern Krahn #language-Krenak #language-Kimaragang #language-Northern Kissi #language-Klias River Kadazan #language-Seroa #language-Okolod #language-Kandas #language-Mser #language-Koorete #language-Korana #language-Karkin #language-Karachay-Balkar #language-Kairui-Midiki #language-Panará #language-Koro (Vanuatu) #language-Kurama #language-Krio #language-Kinaray-A #language-Kerek #language-Karelian #language-Durop #language-Krung #language-Gbaya (Sudan) #language-Kurukh #language-Karon #language-Kryts #language-Sota Kanum #language-Shambala #language-Southern Kalinga #language-Kuanua #language-Kuni #language-Bafia #language-Kusaghe #language-Kölsch #language-Krisa #language-Uare #language-Kansa #language-Kumalu #language-Kumba #language-Kasiguranin #language-Kaba #language-Kwaami #language-Borong #language-Southern Kisi #language-Winyé #language-Kusu #language-S'gaw Karen #language-Kedang #language-Kodaku #language-Kambaata #language-Kokata #language-Nubri #language-Kalkutung #language-Karanga #language-North Muyu #language-Plapo Krumen #language-Kaniet #language-Kurti #language-Karitiâna #language-Kuot #language-Kaduo #language-South Muyu #language-Ketum #language-Kituba (Democratic Republic of Congo) #language-Eastern Katu #language-Kato #language-Kaxararí #language-Juǀʼhoan #language-Kuanyama 
#language-Kutep #language-'Auhelawa #language-Kuman (Papua New Guinea) #language-Kupa #language-Kuikúro-Kalapálo #language-Kuria #language-Kulere #language-Kumyk #language-Kunama #language-Kumukio #language-Kunimaipa #language-Karipuna #language-Kusaal #language-Kutenai #language-Upper Kuskokwim #language-Kukatja #language-Kuuku-Ya'u #language-Kunza #language-Bagvalal #language-Kubu #language-Kove #language-Kui (Indonesia) #language-Kalabakan #language-Kabalai #language-Kuni-Boazi #language-Komodo #language-Psikye #language-Kendem #language-Border Kuna #language-Dobel #language-Geba Karen #language-Kerinci #language-Yinbaw Karen #language-Kola #language-Wersing #language-Yintale Karen #language-Dâw #language-Kwa #language-Kwaio #language-Kwerba #language-Kwara'ae #language-Sara Kaba Deme #language-Kowiai #language-Awa-Cuaiquer #language-Kwanga #language-Kwakiutl #language-Kofyar #language-Kwangali #language-Kwomtari #language-Kodia #language-Kwese #language-Sara Kaba Náà #language-San Salvador Kongo #language-Kwadi #language-Kairiru #language-Krobu #language-Konso #language-Brunei #language-Manumanaw Karen #language-Karo (Ethiopia) #language-Keningau Murut #language-Kulfa #language-Northern Khmer #language-Kanowit-Tanjong Melanau #language-Kanoé #language-Kuvi #language-Konai #language-Kerewo #language-Kwaya #language-Kyaka #language-Krache #language-Kouya #language-Karok #language-Kiput #language-Karao #language-Kamayo #language-Kalapuya #language-Kpatili #language-Kelon #language-Kenga #language-Kuruáya #language-Baram Kayan #language-Kayagar #language-Western Kayah #language-Rapoisi #language-Kayabí #language-Western Karaboro #language-Kaibobo #language-Bondoukou Kulango #language-Kadai #language-Kosena #language-Da'a Kaili #language-Kikai #language-Kelabit #language-Kazukuru #language-Kayeli #language-Kais #language-Kokola #language-Kaningi #language-Kaidipang #language-Kaike #language-Karang #language-Sugut Dusun #language-Kayupulau #language-Karirí-Xocó #language-Kamarian #language-Kalabra #language-Southern Subanen #language-Lacandon #language-Ladino #language-Pattani #language-Lafofa #language-Rangi #language-Lahnda #language-Lambya #language-Lango (Uganda) #language-Lamba #language-Laru #language-Lao #language-Laka (Chad) #language-Qabiao #language-Larteh #language-Lama (Togo) #language-Latin #language-Laba #language-Lauje #language-Tiwa #language-Aribwatsa #language-Label #language-Lakkia #language-Lak #language-Tinani #language-Ladakhi #language-Central Bontok #language-Libon Bikol #language-Rmeet #language-Laven #language-Wampar #language-Lohorung #language-Lachi #language-Labu #language-Lavatbura-Lamusong #language-Tolaki #language-Lawangan #language-Lamalama #language-Lardil #language-Legenyem #language-Luchazi #language-Tungag #language-Western Lawa #language-Luhu #language-Lisabata-Nuniali #language-Kla-Dan #language-Dũya #language-Laari #language-Lemoro #language-Leelau #language-Kaan #language-Landoma #language-Láadan #language-Lega-Shabunda #language-Lala-Bisa #language-Leco #language-Lendu #language-Lyélé #language-Lelemi #language-Lenje #language-Lemio #language-Lengola #language-Leipon #language-Nomaande #language-Lenca #language-Lepcha #language-Lembena #language-Lenkau #language-Lese #language-Lesing-Gelimi #language-Kara (Papua New Guinea) #language-Lamma #language-Ledo Kaili #language-Luang #language-Lezghian #language-Lefa #language-Lingua Franca Nova #language-Lungga #language-Laghu #language-Lugbara #language-Laghuu #language-Lengilu #language-Lingarak 
#language-Wala #language-T'apo #language-Logba #language-Lengo #language-Pahi #language-Longgu #language-Laha (Viet Nam) #language-Laha (Indonesia) #language-Lahu Shi #language-Lhomi #language-Lahanan #language-Mlahsö #language-Lo-Toga #language-Lahu #language-West-Central Limba #language-Likum #language-Hlai #language-Nyindrou #language-Limbu #language-Ligbi #language-Lihir #language-Ligurian #language-Lillooet #language-Limburgan #language-Lingala #language-Liki #language-Sekpele #language-Libido #language-Liberian English #language-Lisu #language-Lithuanian #language-Logorik #language-Liv #language-Liabuku #language-Banda-Bambari #language-Libinza #language-Golpa #language-Rampi #language-Laiyolo #language-Li'o #language-Lampung Api #language-Lakalei #language-Kabras #language-Lakondê #language-Kenyi #language-Lakha #language-Laki #language-Laeko-Libuat #language-Lakon #language-Päri #language-Lakota #language-Kungkari #language-Lokoya #language-Lala-Roba #language-Lele (Guinea) #language-Ladin #language-Lele (Papua New Guinea) #language-Hermit #language-Lole #language-Teke-Laali #language-Lelak #language-Lilau #language-Lasalimu #language-Lele (Chad) #language-North Efate #language-Lau #language-East Limba #language-Merei #language-Limilngan #language-Pévé #language-Lamogai #language-Lambichhong #language-Lamkang #language-Hano #language-Lambadi #language-Lombard #language-Limbum #language-Lamalera #language-Lamenu #language-Lake Miwok #language-Laimbue #language-Lamboya #language-Langbashe #language-Mbalanhu #language-Lundayeh #language-Lanoh #language-Leningitij #language-South Central Banda #language-Lorediakarkar #language-Lamnso' #language-Longuda #language-Loloda #language-Lobi #language-Inonhan #language-Saluan #language-Logol #language-Logo #language-Laarim #language-Loma (Côte d'Ivoire) #language-Lou #language-Loko #language-Mongo #language-Loma (Liberia) #language-Malawi Lomwe #language-Lombo #language-Lopa #language-Téén #language-Loniu #language-Otuho #language-Louisiana Creole #language-Loke #language-Lozi #language-Lelepa #language-Lepki #language-Rara Bakati' #language-Northern Luri #language-Laurentian #language-Laragia #language-Lari #language-Laro #language-Larevat #language-Lemerig #language-Lasgerdi #language-Lishana Deni #language-Lusengo #language-Lish #language-Lashi #language-Saamia #language-Aruop #language-Late Middle Chinese #language-Latgalian #language-Leti (Indonesia) #language-Latundê #language-Luxembourgish #language-Luba-Lulua #language-Luba-Katanga #language-Aringa #language-Ludian #language-Luvale #language-Laua #language-Ganda #language-Luiseno #language-Luna #language-Olu'bo #language-Luimbi #language-Lunda #language-Luo (Kenya and Tanzania) #language-Lumbu #language-Lucumi #language-Laura #language-Lushai #language-Lushootseed #language-Luo (Cameroon) #language-Luyia #language-Maku'a #language-Lavukaleve #language-Standard Latvian #language-White Lachi #language-Eastern Lawa #language-Luwo #language-Lewo #language-Lyngngam #language-Luyana #language-Literary Chinese #language-Litzlitz #language-Laz #language-San Jerónimo Tecóatl Mazatec #language-Madurese #language-Bo-Rukul #language-Mafa #language-Magahi #language-Marshallese #language-Maithili #language-Jalapa De Díaz Mazatec #language-Makasar #language-Malayalam #language-Mam #language-Mandingo #language-Chiquihuitlán Mazatec #language-Marathi #language-Masai #language-San Francisco Matlatzinca #language-Huautla Mazatec #language-Sateré-Mawé #language-Mampruli #language-North Moluccan Malay 
#language-Central Mazahua #language-Western Bukidnon Manobo #language-Macushi #language-Dibabawon Manobo #language-Molale #language-Baba Malay #language-Mangseng #language-Ilianen Manobo #language-Nadëb #language-Maxakalí #language-Ombamba #language-Macaguán #language-Mbo (Cameroon) #language-Malayo #language-Maisin #language-Nukak Makú #language-Sarangani Manobo #language-Matigsalug Manobo #language-Mbula-Bwazza #language-Mbulungish #language-Maring #language-Mari (East Sepik Province) #language-Amoltepec Mixtec #language-Maca #language-Machiguenga #language-Bitur #language-Sharanahua #language-Matsés #language-Mapoyo #language-Maquiritari #language-Mese #language-Mvanip #language-Mbunda #language-Malaccan Creole Portuguese #language-Masana #language-Coatlán Mixe #language-Makaa #language-Ese #language-Menya #language-Mambai #language-Mengisa #language-Cameroon Mambila #language-Minanibai #language-Mawa (Chad) #language-Mpiemo #language-South Watut #language-Mawan #language-Mada (Nigeria) #language-Morigi #language-Male (Papua New Guinea) #language-Mbum #language-Maba (Chad) #language-Moksha #language-Maguindanaon #language-Mamvu #language-Mangbetu #language-Mayogo #language-Mbala #language-Mbole #language-Mandar #language-Maria (Papua New Guinea) #language-Mbere #language-Santa Lucía Monteverde Mixtec #language-Mbosi #language-Dizin #language-Male (Ethiopia) #language-Suruí Do Pará #language-Menka #language-Ikobi #language-Marra #language-Melpa #language-Mengen #language-Megam #language-Midob #language-Meyah #language-Mekeo #language-Central Melanau #language-Mangala #language-Mende (Sierra Leone) #language-Kedah Malay #language-Miriwoong #language-Merey #language-Meru #language-Mato #language-Motu #language-Mano #language-Maaka #language-Hassaniyya #language-Menominee #language-Pattani Malay #language-Bangka #language-Mba #language-Mendankwe-Nkwen #language-Morisyen #language-Naki #language-Mogofin #language-Matal #language-Wandala #language-Mefele #language-Putai #language-Marghi South #language-Cross River Mbembe #language-Mbe #language-Makassar Malay #language-Moba #language-Marrithiyel #language-Mokerang #language-Mbwela #language-Mandjak #language-Mulaha #language-Melo #language-Mayo #language-Mabaan #language-Middle Irish (900-1200) #language-Mararit #language-Morokodo #language-Moru #language-Mango #language-Maklew #language-Makhuwa-Meetto #language-Lijili #language-Mawes #language-Maleu-Kilenge #language-Mambae #language-Meta' #language-Eastern Magar #language-Malila #language-Mambwe-Lungu #language-Manda (Tanzania) #language-Mailu #language-Matengo #language-Matumbi #language-Mbunga #language-Mbugwe #language-Manda (India) #language-Mahongwe #language-Mocho #language-Mbugu #language-Besisi #language-Mamaa #language-Margu #language-Ma'di #language-Mogholi #language-Mungaka #language-Mauwake #language-Mócheno #language-Mashi (Zambia) #language-Mandan #language-Eastern Mari #language-Buru (Indonesia) #language-Mandahuaca #language-Digaro-Mishmi #language-Mbukushu #language-Maru #language-Ma'anyan #language-Mor (Mor Islands) #language-Miami #language-Atatláhuca Mixtec #language-Mi'kmaq #language-Mandaic #language-Ocotepec Mixtec #language-Mofu-Gudur #language-San Miguel El Grande Mixtec #language-Chayuco Mixtec #language-Abar #language-Mikasuki #language-Peñoles Mixtec #language-Alacatlatzala Mixtec #language-Minangkabau #language-Pinotepa Nacional Mixtec #language-Apasco-Apoala Mixtec #language-Mískito #language-Isthmus Mixe #language-Southern Puebla Mixtec #language-Cacaloxtepec 
Mixtec #language-Akoye #language-Mixtepec Mixtec #language-San Juan Colorado Mixtec #language-Northwest Maidu #language-Tu #language-Mwera (Nyasa) #language-Kim Mun #language-Mawak #language-Matukar #language-Medebur #language-Miship #language-Sauria Paharia #language-Karbi #language-Mahali #language-Mahican #language-Mbre #language-Siliput #language-Macedonian #language-Mawchi #language-Miya #language-Mak (China) #language-Dhatki #language-Mokilese #language-Mokole #language-Moklen #language-Kupang Malay #language-Moikodi #language-Bay Miwok #language-Malas #language-Silacayoapan Mixtec #language-Mafea #language-Kituba (Congo) #language-East Makian #language-Makasae #language-Malo #language-Cao Lan #language-Manambu #language-Mal #language-Malagasy #language-Mape #language-Miltu #language-Ilwana #language-Malua Bay #language-Mulam #language-Malango #language-Bargam #language-Western Maninkakan #language-Vame #language-Masalit #language-Maltese #language-To'abaita #language-Motlav #language-Moloko #language-Malfaxal #language-Mama #language-Momina #language-Maonan #language-Mae #language-Mundat #language-North Ambrym #language-Mehináku #language-Musar #language-Maii #language-Mamanwa #language-Mangga Buang #language-Siawi #language-Musak #language-Western Xiangxi Miao #language-Malalamai #language-Mmaala #language-Emae #language-Madak #language-Migaama #language-Mabaale #language-Mbula #language-Muna #language-Manchu #language-Mondé #language-Naba #language-Mundani #language-Eastern Mnong #language-Mono (Democratic Republic of Congo) #language-Manipuri #language-Munji #language-Mandinka #language-Tiale #language-Mapena #language-Min Bei Chinese #language-Minriq #language-Mono (USA) #language-Mansi #language-Rennell-Bellona #language-Mon #language-Manikion #language-Manyawa #language-Moni #language-Mwan #language-Mocoví #language-Mobilian #language-Innu #language-Mongondow #language-Mohawk #language-Mboi #language-Monzombo #language-Morori #language-Mangue #language-Mopán Maya #language-Mor (Bomberai Peninsula) #language-Moro #language-Mossi #language-Barí #language-Mogum #language-Mohave #language-Molima #language-Shekkacho #language-Mukulu #language-Mpoto #language-Malak Malak #language-Mangarrayi #language-Machinere #language-Majang #language-Marba #language-Maung #language-Mpade #language-Martu Wangka #language-Mbara (Chad) #language-Middle Watut #language-Yosondúa Mixtec #language-Mindiri #language-Migabac #language-Matís #language-Vangunu #language-Dadibi #language-Mian #language-Makuráp #language-Mungkip #language-Mapidian #language-Misima-Panaeati #language-Mapia #language-Mpi #language-Maba (Indonesia) #language-Mbuko #language-Mangole #language-Matepi #language-Momuna #language-Mamasa #language-Rajah Kabunsuwan Manobo #language-Mbelime #language-South Marquesan #language-Moronene #language-Modole #language-Manipa #language-Mander #language-West Makian #language-Mandari #language-Mosimo #language-Murupi #language-Mamuju #language-Manggarai #language-Pano #language-Mlabri #language-Marino #language-Maricopa #language-Elseng #language-Mising #language-Mara Chin #language-Maori #language-Western Mari #language-Hmwaveke #language-Mortlockese #language-Merlav #language-Cheke Holo #language-Mru #language-Morouas #language-North Marquesan #language-Maria (India) #language-Maragus #language-Marghi Central #language-Mono (Cameroon) #language-Mangareva #language-Maranao #language-Maremgi #language-Mandaya #language-Marind #language-Malay (macrolanguage) #language-Masbatenyo #language-Musey 
#language-Mekwei #language-Moraid #language-Masikoro Malagasy #language-Ma (Democratic Republic of Congo) #language-Mansaka #language-Molof #language-Agusan Manobo #language-Vurës #language-Mombum #language-West Masela #language-Musom #language-Mansoanka #language-Moresada #language-Aruamu #language-Momare #language-Cotabato Manobo #language-Munit #language-Mualang #language-Mono (Solomon Islands) #language-Murik (Papua New Guinea) #language-Una #language-Munggui #language-Maiwa (Papua New Guinea) #language-Moskona #language-Mbe' #language-Montol #language-Mator #language-Matagalpa #language-Totontepec Mixe #language-Wichí Lhamtés Nocten #language-Muong #language-Mewari #language-Yora #language-Mota #language-Tututepec Mixtec #language-Asaro'o #language-Nabi #language-Mundang #language-Mubi #language-Ajumbu #language-Media Lengua #language-Musgu #language-Mündü #language-Musi #language-Mugom #language-Multiple languages #language-Nyong #language-Malvi #language-Murle #language-Creek #language-Yaaku #language-Bo-Ung #language-Muyang #language-Mursi #language-Manam #language-Mattole #language-Mamboru #language-Marwari (Pakistan) #language-Peripheral Mongolian #language-Miyako #language-Minaveha #language-Marovo #language-Duri #language-Moere #language-Marau #language-Mpotovoro #language-Marfa #language-Tagal Murut #language-Meoswar #language-Indus Kohistani #language-Mesqan #language-Mwatebu #language-Juwal #language-Are #language-Mwera (Chimwera) #language-Murrinh-Patha #language-Aiklep #language-Mouk-Aria #language-Labo #language-Mirandese #language-Sar #language-Nyamwanga #language-Central Maewo #language-Kala Lagaw Ya #language-Mün Chin #language-Marwari #language-Moken #language-Mentawai #language-Hmong Daw #language-Tezoatlán Mixtec #language-Manyika #language-Modang #language-Mele-Fila #language-Malgbe #language-Mbangala #language-Mozarabic #language-Miju-Mishmi #language-Monumbo #language-Meramera #language-Moi (Indonesia) #language-Tlahuitoltepec Mixe #language-Juquila Mixe #language-Murik (Malaysia) #language-Jamiltepec Mixtec #language-Mada (Cameroon) #language-Metlatónoc Mixtec #language-Namo #language-Mahou #language-Southeastern Nochixtlán Mixtec #language-Central Masela #language-Burmese #language-Mbay #language-Myene #language-Bambassi #language-Manta #language-Makah #language-Mangayat #language-Mamara Senoufo #language-Moma #language-Me'en #language-Anfillo #language-Pirahã #language-Muniche #language-Mesmes #language-Mundurukú #language-Erzya #language-Muyuw #language-Masaaba #language-Macuna #language-Classical Mandaic #language-Tumzabt #language-Malimba #language-Morawa #language-Wichí Lhamtés Güisnay #language-Ixcatlán Mazatec #language-Manya #language-Nigeria Mambila #language-Mumuye #language-Mazanderani #language-Matipuhy #language-Movima #language-Mori Atas #language-Marúbo #language-Macanese #language-Mintil #language-Inapang #language-Manza #language-Deg #language-Maiadomu #language-Southern Nambikuára #language-Narak #language-Naka'ela #language-Nabak #language-Naga Pidgin #language-Nalu #language-Nakanai #language-Nalik #language-Ngan'gityemerri #language-Neapolitan #language-Khoekhoe #language-Iguta #language-Naasioi #language-Ca̱hungwa̱rya̱ #language-Nauru #language-Navajo #language-Nawuri #language-Nakwi #language-Ngarrindjeri #language-Coatepec Nahuatl #language-Nyemba #language-Ndoe #language-Chang Naga #language-Konyak Naga #language-Ngamo #language-Mao Naga #language-Ngarinyman #language-Nake #language-South Ndebele #language-Ngbaka Ma'bo #language-Kuri 
#language-Nkukoli #language-Nnam #language-Nggem #language-Numana #language-Rongmei Naga #language-Ngamambo #language-Southern Ngbandi #language-Ningera #language-Iyo #language-Central Nicobarese #language-Ponam #language-Nachering #language-Yale #language-Notsi #language-Nisga'a #language-Central Huasteca Nahuatl #language-Classical Nahuatl #language-Northern Puebla Nahuatl #language-Na-kara #language-Michoacán Nahuatl #language-Nambo #language-Nauna #language-Sibe #language-Ncane #language-Chothe Naga #language-Chumburung #language-Central Puebla Nahuatl #language-Natchez #language-Ndasa #language-Kenswei Nsei #language-Ndau #language-Nde-Nsele-Nta #language-North Ndebele #language-Ndengereko #language-Ndali #language-Samba Leko #language-Ndamba #language-Ndam #language-Ngundi #language-Ndonga #language-Ndombe #language-Ndoola #language-Low German #language-Ndunga #language-Dugun #language-Ndut #language-Nduga #language-Lutos #language-Ndogo #language-Toura (Côte d'Ivoire) #language-Nedebang #language-Nêlêmwa-Nixumwak #language-Negidal #language-Nyenkha #language-Neko #language-Nemi #language-Nengone #language-Ná-Meo #language-North Central Mixe #language-Yahadian #language-Nete #language-Neo #language-Nyaheun #language-Newari #language-Neyo #language-Nez Perce #language-Dhao #language-Ahwai #language-Ayiwo #language-Nafaanra #language-Mfumte #language-Ngbaka #language-Northern Ngbandi #language-Ngombe (Democratic Republic of Congo) #language-Ngemba #language-Ngbaka Manza #language-Nǁng #language-Ngizim #language-Ngie #language-Dalabon #language-Lomwe #language-Ngwo #language-Ngulu #language-Ngurimi #language-Engdewu #language-Gvoko #language-Kriang #language-Guerrero Nahuatl #language-Nagumi #language-Nggwahyi #language-Tibea #language-Ngungwel #language-Nhanda #language-Beng #language-Tabasco Nahuatl #language-Chiripá #language-Eastern Huasteca Nahuatl #language-Nhuwala #language-Tetelcingo Nahuatl #language-Zacatlán-Ahuacatlán-Tepetzintla Nahuatl #language-Morelos Nahuatl #language-Central Nahuatl #language-Takuu #language-Isthmus-Pajapan Nahuatl #language-Huaxcaleca Nahuatl #language-Naro #language-Ometepec Nahuatl #language-Noone #language-Temascaltepec Nahuatl #language-Western Huasteca Nahuatl #language-Isthmus-Mecayapan Nahuatl #language-Northern Oaxaca Nahuatl #language-Nias #language-Nakame #language-Ngandi #language-Niellim #language-Nek #language-Ngalakgan #language-Nyiha (Tanzania) #language-Nii #language-Ngaju #language-Nila #language-Nilamba #language-Ninzo #language-Nganasan #language-Nandi #language-Nimboran #language-Nimi #language-Southeastern Kolami #language-Niuean #language-Gilyak #language-Nimo #language-Ngiti #language-Ningil #language-Nzanyi #language-Nocte Naga #language-Lotha Naga #language-Gudanji #language-Njen #language-Njalgulgule #language-Angami Naga #language-Liangmai Naga #language-Ao Naga #language-Njerep #language-Ngadjunmaya #language-Njyem #language-Nyishi #language-Nkoya #language-Khoibu Naga #language-Nkongho #language-Koireng #language-Duke #language-Inpui Naga #language-Nekgini #language-Khezha Naga #language-Thangal Naga #language-Nakai #language-Nokuku #language-Nkonya #language-Niuatoputapu #language-Nukuoro #language-North Asmat #language-Bouna Kulango #language-Nkutu #language-Nkoroo #language-Nkari #language-Ngombale #language-Nalca #language-Dutch #language-East Nyala #language-Gela #language-Grangali #language-Nyali #language-Nihali #language-Ngul #language-Nchumbulu #language-Nahali #language-Nyamal #language-Nalögo #language-Maram Naga 
#language-Big Nambas #language-Ngam #language-Ndumu #language-Mzieme Naga #language-Tangkhul Naga (India) #language-Kwasio #language-Monsang Naga #language-Ngombe (Central African Republic) #language-Namakura #language-Ndemli #language-Manangba #language-ǃXóõ #language-Nimanbur #language-Letemboi #language-Namonuito #language-Northeast Maidu #language-Ngamini #language-Nimoa #language-Nama (Papua New Guinea) #language-Namuyi #language-Nawdm #language-Nyangumarta #language-Nande #language-Nancere #language-West Ambae #language-Ngaing #language-Maring Naga #language-Ngiemboon #language-Nyangatom #language-Nankina #language-Northern Rengma Naga #language-Namia #language-Norwegian Nynorsk #language-Wancho Naga #language-Ngindo #language-Narungga #language-Nanticoke #language-Dwang #language-Nugunu (Australia) #language-Southern Nuni #language-Nyangga #language-Nda'nda' #language-Woun Meu #language-Norwegian Bokmål #language-Nuk #language-Northern Thai #language-Nimadi #language-Nogai #language-Nomu #language-Nonuya #language-Nooksack #language-Nomlaki #language-nom #language-Old Norse #language-Numanggang #language-Ngongo #language-Norwegian #language-Nomatsiguenga #language-Ewage-Notu #language-Novial #language-Nyambo #language-Nayi #language-Phom Naga #language-Nepali (individual language) #language-Southeastern Puebla Nahuatl #language-Mondropolon #language-Pochuri Naga #language-Nipsan #language-Napu #language-Southern Nago #language-Kura Ede Nago #language-Ndom #language-N'Ko #language-Ngom #language-Nara #language-Southern Rengma Naga #language-Jèrriais #language-Narango #language-Chokri Naga #language-Ngarla #language-Ngarluma #language-Narom #language-Norn #language-Northern Kalapuya #language-Narua #language-Lala #language-Sangtam Naga #language-Lower Nossob #language-Nsenga #language-Ngasa #language-Naskapi #language-Sumi Naga #language-Nehan #language-Pedi #language-Northern Sierra Miwok #language-Nali #language-Tase Naga #language-Sierra Negra Nahuatl #language-Navut #language-Nsongo #language-Nasal #language-Nisenan #language-Natioro #language-Ngaanyatjarra #language-Ikoma-Nata-Isenye #language-Nateni #language-Ntomba #language-Northern Tepehuan #language-Natügu #language-Nottoway #language-Mantsi #language-Natanzi #language-Yuanga #language-Nukuini #language-Ngala #language-Nusu #language-Nungali #language-Ngumbi #language-Nyole #language-Nuu-chah-nulth #language-Nusa Laut #language-Niuafo'ou #language-Anong #language-Nguôn #language-Nupe-Nupe-Tako #language-Nukuria #language-Nuer #language-Nung (Viet Nam) #language-Nguluwan #language-Mehek #language-Nunggubuyu #language-Tlamacazapa Nahuatl #language-Nasarian #language-Nyokon #language-Nawathinehena #language-Nyabwa #language-Ngwe #language-Southwest Tanna #language-Nawaru #language-Nottoway-Meherrin #language-Nauete #language-Nage #language-Ngad'a #language-Nindi #language-South Nuaulu #language-Ngawun #language-Naxi #language-Ninggerum #language-Nafri #language-Nyanja #language-Nyangbo #language-Nyore #language-Nyengo #language-Giryama #language-Nyikina #language-Ama (Sudan) #language-Nyanga #language-Nyaneka #language-Nyeu #language-Nyamwezi #language-Nyankole #language-Nyoro #language-Nyang'i #language-Nayini #language-Nyungar #language-Nyawaygi #language-Nyungwe #language-Nyulnyul #language-Nganyaywana #language-Nyakyusa-Ngonde #language-Tigon Mbembe #language-Njebi #language-Nzadi #language-Nzima #language-Nzakara #language-Zeme Naga #language-Nzakambay #language-Nanga Dama Dogon #language-Orok #language-Oroch #language-Old 
Aramaic (up to 700 BCE) #language-Obispeño #language-Southern Bontok #language-Moabite #language-Obo Manobo #language-Old Burmese #language-Old Breton #language-Ocaina #language-Old Chinese #language-Occitan (post 1500) #language-Old Cornish #language-Atzingo Matlatzinca #language-Old Dutch #language-Odual #language-Ofo #language-Old Frisian #language-Efutop #language-Ogbia #language-Ogbah #language-Old Georgian #language-Khana #language-Ogbronuagum #language-Old Hittite #language-Oirata #language-Northwestern Ojibwa #language-Central Ojibwa #language-Eastern Ojibwa #language-Ojibwa #language-Old Japanese #language-Severn Ojibwa #language-Ontong Java #language-Western Ojibwa #language-Okanagan #language-Okobo #language-Okodia #language-Okpe (Southwestern Edo) #language-Okiek #language-Oko-Juwoi #language-Oki-No-Erabu #language-Old Korean (3rd-9th cent.) #language-Kirike #language-Oko-Eni-Osayen #language-Oku #language-Orokaiva #language-Walungge #language-Mochi #language-Olekha #language-Oloma #language-Livvi #language-Old Lithuanian #language-Omaha-Ponca #language-East Ambae #language-Mochica #language-Omagua #language-Omok #language-Ombo #language-Utarmbung #language-Old Marathi #language-Omotik #language-Omurano #language-South Tairora #language-Old Mon #language-Ona #language-Lingao #language-Oneida #language-Olo #language-Onin #language-Onjob #language-Onobasulu #language-Onondaga #language-Sartang #language-Ono #language-Unua #language-Old Nubian #language-Tohono O'odham #language-Ong #language-Önge #language-Okpamheri #language-Kopkaka #language-Oksapmin #language-Opao #language-Opata #language-Ofayé #language-Oroha #language-Orma #language-Orejón #language-Oring #language-Oroqen #language-Oriya (macrolanguage) #language-Oromo #language-Orokolo #language-Oruma #language-Adivasi Oriya #language-Ormuri #language-Old Russian #language-Oro Win #language-Oro #language-Odia #language-Ormu #language-Osage #language-Oscan #language-Osing #language-Old Spanish #language-Ossetian #language-Old Saxon #language-Ottoman Turkish (1500-1928) #language-Ot Danum #language-Mezquital Otomi #language-Oti #language-Old Turkish #language-Tilapa Otomi #language-Eastern Highland Otomi #language-Querétaro Otomi #language-Otoro #language-Estado de México Otomi #language-Temoaya Otomi #language-Otuke #language-Ottawa #language-Ixtenco Otomi #language-Tagargrent #language-Glio-Oubi #language-Ouma #language-Elfdalian #language-Owiniga #language-Old Welsh #language-Oy #language-Oyda #language-Wayampi #language-Koonzime #language-Parecís #language-Pacoh #language-Paumarí #language-Pagibete #language-Paranawát #language-Pangasinan #language-Tenharim #language-Pe #language-Parakanã #language-Pahlavi #language-Pampanga #language-Panjabi #language-Northern Paiute #language-Papiamento #language-Parya #language-Panamint #language-Papasena #language-Palauan #language-Pakaásnovos #language-Pawnee #language-Pech #language-Pankararú #language-Páez #language-Mezontla Popoloca #language-Coyotepec Popoloca #language-Paraujano #language-E'ñapa Woromaipu #language-Parkwa #language-Kpasam #language-Papel #language-Badyara #language-Pangwa #language-Central Pame #language-Southern Pashto #language-Northern Pashto #language-Pnar #language-Pyu (Papua New Guinea) #language-Santa Inés Ahuatempan Popoloca #language-Pear #language-Bouyei #language-Picard #language-Ruching Palaung #language-Duruwa #language-Parenga #language-Paite Chin #language-Nigerian Pidgin #language-Piti #language-Pacahuara #language-Anam #language-Pennsylvania 
German #language-Padoe #language-Plautdietsch #language-Kayan #language-Eastern Pomo #language-Mala (Papua New Guinea) #language-Taje #language-Northeastern Pomo #language-Pengo #language-Bonan #language-Chichimeca-Jonaz #language-Northern Pomo #language-Penchal #language-Phende #language-Old Persian (ca. 600-400 B.C.) #language-Kunja #language-Southern Pomo #language-Iranian Persian #language-Petats #language-Eastern Penan #language-Pere #language-Pfaelzisch #language-Sudanese Creole Arabic #language-Gāndhārī #language-Rerep #language-Paelignian #language-Pagu #language-Pa-Hng #language-Phuong #language-Phake #language-Phalura #language-Phoenician #language-Phunoi #language-Phu Thai #language-Pima Bajo #language-Yine #language-Pinji #language-Piaroa #language-Piro #language-Pingelapese #language-Pitcairn-Norfolk #language-Pijao #language-Yom #language-Powhatan #language-Piame #language-Piapoco #language-Pero #language-Piratapuyo #language-Pijin #language-Pitta Pitta #language-Pintupi-Luritja #language-Pileni #language-Pimbwe #language-Piu #language-Pije #language-Pitjantjatjara #language-Ardhamāgadhī Prākrit #language-Pokomo #language-Paekche #language-Pak-Tong #language-Pankhu #language-Pakanha #language-Pökoot #language-Pukapuka #language-Maleng #language-Paku #language-Miani #language-Polonombauk #language-Central Palawano #language-Palu'e #language-Pilagá #language-Paulohi #language-Pali #language-plj #language-Kohistani Shina #language-Shwe Palaung #language-Palenquero #language-Oluta Popoluca #language-Palaic #language-Palaka Senoufo #language-San Marcos Tlacoyalco Popoloca #language-Plateau Malagasy #language-Palikúr #language-Southwest Palawano #language-Brooke's Point Palawano #language-Bolyu #language-Paluan #language-Paama #language-Pambia #language-Pamona #language-Māhārāṣṭri Prākrit #language-Northern Pumi #language-Pamlico #language-Lingua Franca #language-Pomo #language-Pam #language-Pom #language-Northern Pame #language-Paynamar #language-Piemontese #language-Tuamotuan #language-Plains Miwok #language-Papuan Malay #language-Southern Pame #language-Punan Bah-Biau #language-Western Panjabi #language-Western Penan #language-Pangu #language-Penrhyn #language-Aoheng #language-Pinjarup #language-Pinai-Hagahai #language-Panobo #language-Pancana #language-Pana (Burkina Faso) #language-Panim #language-Ponosakan #language-Pontic #language-Jiongnai Bunu #language-Pinigura #language-Banyjima #language-Pinyin #language-Poqomam #language-San Juan Atzingo Popoloca #language-Poke #language-Potiguára #language-Poqomchi' #language-Highland Popoluca #language-Pokangá #language-Polish #language-Southeastern Pomo #language-Pohnpeian #language-Central Pomo #language-Texistepec Popoluca #language-Portuguese #language-Sayula Popoluca #language-Potawatomi #language-Upper Guinea Crioulo #language-San Felipe Otlaltepec Popoloca #language-Polabian #language-Pogolo #language-Papi #language-Paipai #language-Uma #language-Pipil #language-Papuma #language-Papapana #language-Folopa #language-Pei #language-Pare #language-Papora #language-Pa'a #language-Malecite-Passamaquoddy #language-Parachi #language-Parsi-Dari #language-Principense #language-Paranan #language-Prussian #language-Porohanon #language-Paicî #language-Parauk #language-Kibiri #language-Prasuni #language-Old Provençal (to 1500) #language-Ashéninka Perené #language-Puri #language-Dari #language-Puragi #language-Parawen #language-Purik #language-Asue Awyu #language-Central Malay #language-Southwest Pashai #language-Southeast Pashai 
#language-Pauserna #language-Panasuan #language-Pasi #language-Kaulong #language-Central Pashto #language-Sauraseni Prākrit #language-Port Sandwich #language-Pai Tavytera #language-Pataxó Hã-Ha-Hãe #language-Pindiini #language-Patani #language-Zo'é #language-Patep #language-Piamatsina #language-Bambam #language-Port Vato #language-Western Highland Purepecha #language-Purum #language-Punan Merap #language-Puelche #language-Puinave #language-Punan Tubu #language-Puma #language-Puoc #language-Pulabu #language-Puquina #language-Puruborá #language-Punu #language-Puluwatese #language-Purisimeño #language-Pawaia #language-Gapapaiwa #language-Patwin #language-Molbog #language-Paiwan #language-Pwo Western Karen #language-Fyam #language-Poyanáwa #language-Puyuma #language-Pyen #language-Quapaw #language-Huallaga Huánuco Quechua #language-K'iche' #language-Quechua #language-Lambayeque Quechua #language-Chimborazo Highland Quichua #language-South Bolivian Quechua #language-Quileute #language-Chachapoyas Quechua #language-North Bolivian Quechua #language-Sipacapense #language-Quinault #language-Southern Pastaza Quechua #language-Santiago del Estero Quichua #language-Sacapulteco #language-Tena Lowland Quichua #language-Yauyos Quechua #language-Ayacucho Quechua #language-Cusco Quechua #language-Ambo-Pasco Quechua #language-Cajamarca Quechua #language-Eastern Apurímac Quechua #language-Imbabura Highland Quichua #language-Cajatambo North Lima Quechua #language-North Junín Quechua #language-Napo Lowland Quechua #language-San Martín Quechua #language-Huaylla Wanca Quechua #language-Queyu #language-Northern Pastaza Quichua #language-Corongo Ancash Quechua #language-Huaylas Ancash Quechua #language-Kuman (Russia) #language-Kwalhioqua-Tlatskanai #language-Chiquián Ancash Quechua #language-Northern Conchucos Ancash Quechua #language-Southern Conchucos Ancash Quechua #language-Puno Quechua #language-Qashqa'i #language-Cañar Highland Quichua #language-Southern Qiang #language-Arequipa-La Unión Quechua #language-Jauja Wanca Quechua #language-Quenya #language-Quiripi #language-Dungmali #language-Camling #language-Rasawa #language-Rade #language-Western Meohang #language-Logooli #language-Rabha #language-Ramoaaina #language-Rajasthani #language-Tulu-Bohuai #language-Ralte #language-Canela #language-Riantana #language-Rao #language-Rapanui #language-Rarotongan #language-Tegali #language-Raute #language-Sampang #language-Rawang #language-Rapa #language-Rahambuu #language-Rumai Palaung #language-Barababaraba #language-Réunion Creole French #language-Rerau #language-Rembong #language-Rejang Kayan #language-Kara (Tanzania) #language-Rejang #language-Rendille #language-Remo #language-Rengao #language-Reshe #language-Reyesano #language-Roria #language-Rangkas #language-Romagnol #language-Resígaro #language-Southern Roglai #language-Rohingya #language-Yahang #language-Tarifit #language-Riang Lang #language-Nyaturu #language-Nungu #language-Ribun #language-Ritharrngu #language-Riung #language-Raji #language-Rajbanshi #language-Rikbaktsa #language-Rakahanga-Manihiki #language-Rakhine #language-Marka #language-Rama #language-Rembarrnga #language-Carpathian Romani #language-Angloromani #language-Kalo Finnish Romani #language-Traveller Norwegian #language-Murkim #language-Lomavren #language-Romkun #language-Baltic Romani #language-Roma #language-Balkan Romani #language-Sinte Romani #language-Rempi #language-Caló #language-Domari #language-Tavringer Romani #language-Romanova #language-Welsh Romani #language-Vlax Romani 
#language-Ruund #language-Ronga #language-Roon #language-Rongpo #language-Nari Nari #language-Rungwa #language-Tae' #language-Rogo #language-Ronji #language-Rombo #language-Northern Roglai #language-Romansh #language-Romblomanon #language-Romany #language-Romanian #language-Rotokas #language-Kriol #language-Runga #language-Repanbitip #language-Rapting #language-Ririo #language-Waima #language-Romano-Serbian #language-Ratahan #language-Rotuman #language-Rathawi #language-Gungu #language-Rusyn #language-Luguru #language-Roviana #language-Rufiji #language-Che #language-Rundi #language-Istro Romanian #language-Macedo-Romanian #language-Megleno Romanian #language-Russian #language-Rutul #language-Lanas Lobu #language-Rawo #language-Rwa #language-Amba (Uganda) #language-Rawa #language-Marwari (India) #language-Northern Amami-Oshima #language-Yaeyama #language-Central Okinawan #language-Buglere #language-Meskwaki #language-Sandawe #language-Sabanê #language-Safaliba #language-Sango #language-Yakut #language-Sahu #language-Sake #language-Samaritan Aramaic #language-Sanskrit #language-Sause #language-Samburu #language-Saraveca #language-Sasak #language-Santali #language-Saleman #language-Saafi-Saafi #language-Sawi #language-Sa #language-Saya #language-Saurashtra #language-Ngambay #language-Simbo #language-Kele (Papua New Guinea) #language-Southern Samo #language-Saliba #language-Chabu #language-Seget #language-Sori-Harengan #language-Seti #language-Safwa #language-Botolan Sambal #language-Sagala #language-Sabüm #language-Sangu (Tanzania) #language-Sileibi #language-Sembakung Murut #language-Subiya #language-Kimki #language-Stod Bhoti #language-Sabine #language-Simba #language-Chut #language-Dongxiang #language-Sanggau #language-Sri Lankan Creole Malay #language-Sadri #language-Shina #language-Sicilian #language-Scots #language-Hyolmo #language-Sa'och #language-North Slavey #language-Shumcho #language-Sheni #language-Sha #language-Toraja-Sa'dan #language-Sassarese Sardinian #language-Surubu #language-Savi #language-Southern Kurdish #language-Suundi #language-Gallurese Sardinian #language-Bukar-Sadung Bidayuh #language-Sherdukpen #language-Semandang #language-Sened #language-Shuadit #language-Sallands #language-Semai #language-Sechelt #language-Sedang #language-Seneca #language-Cebaara Senoufo #language-Sena #language-Seri #language-Sene #language-Sekani #language-Selkup #language-Nanerigé Sénoufo #language-Suarmin #language-Senara Sénoufo #language-Serrano #language-Koyraboro Senni Songhai #language-Sentani #language-Serui-Laut #language-Nyarafolo Senoufo #language-Sewa Bay #language-Secoya #language-Old Irish (to 900) #language-Kipsigis #language-Surigaonon #language-Segai #language-Shughni #language-Suga #language-Singa #language-Singpho #language-Sangisari #language-Samogitian #language-Brokpake #language-Sebat Bet Gurage #language-Sanglechi #language-Sursurunga #language-Shall-Zwall #language-Ninam #language-Sheko #language-Shua #language-Shoshoni #language-Tachelhit #language-Shatt #language-Shilluk #language-Shendu #language-Shan #language-Shanga #language-Shipibo-Conibo #language-Sala #language-Shi #language-Shuswap #language-Shasta #language-Chadian Arabic #language-Shehri #language-Shwai #language-She #language-Tachawit #language-Syenara Senoufo #language-Akkala Sami #language-Sebop #language-Sidamo #language-Simaa #language-Siamou #language-Paasaal #language-Shom Peng #language-Numbami #language-Sikiana #language-Tumulung Sisaala #language-Mende (Papua New Guinea) #language-Sinhala 
#language-Sikkimese #language-Sonia #language-Siri #language-Siuslaw #language-Sinagen #language-Siwai #language-Sumau #language-Sivandi #language-Siwi #language-Epena #language-Kildin Sami #language-Pite Sami #language-Assangori #language-Kemi Sami #language-Sajalong #language-Mapun #language-Sindarin #language-Xibe #language-Siar-Lak #language-Senhaja De Srair #language-Ter Sami #language-Ume Sami #language-Shawnee #language-Skagit #language-Saek #language-Ma Manda #language-Southern Sierra Miwok #language-Seke (Vanuatu) #language-Sakirabiá #language-Sakalava Malagasy #language-Sikule #language-Sika #language-Seke (Nepal) #language-Kutong #language-Saraiki #language-Maia #language-Sakata #language-Sakao #language-Skou #language-Skepi Creole Dutch #language-Seko Padang #language-Sikaiana #language-Sekar #language-Sáliba #language-Selungai Murut #language-Southern Puget Sound Salish #language-Lower Silesian #language-Salumá #language-Slovak #language-Salt-Yui #language-Pangutaran Sama #language-Salinan #language-Lamaholot #language-Salar #language-Selaru #language-Slovenian #language-Sialum #language-Selayar #language-Ma'ya #language-Southern Sami #language-Simbari #language-Som #language-Northern Sami #language-Simbali #language-Lule Sami #language-Bolinao #language-Central Sama #language-Inari Sami #language-Samoan #language-Samaritan #language-Samo #language-Simeulue #language-Skolt Sami #language-Sumbawa #language-Samba #language-Semnani #language-Shona #language-Sinaugoro #language-Sindhi #language-Bau Bidayuh #language-Noon #language-Sanga (Democratic Republic of Congo) #language-Sensi #language-Soninke #language-Sangil #language-Siona #language-Snohomish #language-Siane #language-Sangu (Gabon) #language-Sihan #language-South West Bay #language-Senggi #language-Sa'ban #language-Selee #language-Sam #language-Saniyo-Hiyewe #language-Kou #language-Sobei #language-So (Democratic Republic of Congo) #language-Songoora #language-Songomeno #language-Sogdian #language-Aka #language-Sokoro #language-Solos #language-Somali #language-Songe #language-Kanasi #language-Somrai #language-Seeku #language-Southern Sotho #language-Southern Thai #language-Sonsorol #language-Sowanda #language-Temi #language-Spanish #language-Sepa (Indonesia) #language-Sapé #language-Saep #language-Sepa (Papua New Guinea) #language-Saponi #language-Selepet #language-Akukem #language-Sanapaná #language-Spokane #language-Supyire Senoufo #language-Saparua #language-Saposa #language-Sapuan #language-South Picene #language-Sabaot #language-Shama-Sambuga #language-Albanian #language-Susquehannock #language-Sorkhei #language-Sou #language-Siculo Arabic #language-Soqotri #language-Squamish #language-Saruga #language-Sora #language-Logudorese Sardinian #language-Sardinian #language-Nafi #language-Sarikoli #language-Siriano #language-Serudung Murut #language-Isirawa #language-Saramaccan #language-Sranan Tongo #language-Campidanese Sardinian #language-Serbian #language-Sirionó #language-Serer #language-Sarsi #language-Suruí #language-Southern Sorsoganon #language-Serua #language-Sera #language-Shahmirzadi #language-Southern Sama #language-Suba-Simbiti #language-Siroi #language-Balangingi #language-Thao #language-Seimat #language-Sausi #language-Sunam #language-Western Sisaala #language-Semnam #language-Sissano #language-Spanish Sign Language #language-So'a #language-Sô #language-Sinasina #language-Susuami #language-Shark Bay #language-Swati #language-Samberigi #language-Saho #language-Sengseng #language-Liana-Seti #language-Seta 
#language-Trieng #language-Shelta #language-Bulo Stieng #language-Matya Samo #language-Arammba #language-Stellingwerfs #language-Owa #language-Stoney #language-Southeastern Tepehuan #language-Saterfriesisch #language-Straits Salish #language-Shumashti #language-Samtao #language-Silt'e #language-Satawalese #language-Sulka #language-Suku #language-Western Subanon #language-Suena #language-Suganga #language-Suki #language-Shubi #language-Sukuma #language-Sundanese #language-Tirmaga-Chai Suri #language-Mwaghavul #language-Susu #language-Subtiaba #language-Puroik #language-Sumbwa #language-Sumerian #language-Suyá #language-Sunwar #language-Svan #language-Ulau-Suain #language-Vincentian Creole English #language-Serili #language-Savosavo #language-Maore Comorian #language-Congo Swahili #language-Swedish #language-Sere #language-Swabian #language-Swahili (individual language) #language-Sui #language-Sira #language-Samosa #language-Sawknah #language-Shanenawa #language-Suau #language-Sharwa #language-Seluwasan #language-Sawila #language-Suwawa #language-Shekhawati #language-Sowa #language-Suruahá #language-Sarua #language-Suba #language-Sighu #language-Shuhi #language-Samre #language-Sangir #language-Saaroa #language-Upper Saxon #language-Siang #language-Central Subanen #language-Classical Syriac #language-Seki #language-Sukur #language-Sylheti #language-Maya Samo #language-Senaya #language-Syriac #language-Sinyar #language-Kagate #language-Semelai #language-Ngalum #language-Semaq Beri #language-szd #language-Seze #language-Sengele #language-Silesian #language-Sula #language-Suabo #language-Isu (Fako Division) #language-Sawai #language-Lower Tanana #language-Tabassaran #language-Lowland Tarahumara #language-Tause #language-Tariana #language-Tapirapé #language-Tagoi #language-Tahitian #language-Eastern Tamang #language-Tal #language-Tamil #language-Tangale #language-Yami #language-Taabwa #language-Tamasheq #language-Central Tarahumara #language-Tay Boi #language-Tatar #language-Upper Tanana #language-Tatuyo #language-Atayal #language-Tocho #language-Aikanã #language-Takia #language-Kaki Ae #language-Tanimbili #language-Mandara #language-North Tairora #language-Dharawal #language-Gaam #language-Tiang #language-Calamian Tagbanwa #language-Tboli #language-Tagbu #language-Barro Negro Tunebo #language-Tawala #language-Taworta #language-Tumtum #language-Tanguat #language-Tembo (Kitembo) #language-Tubar #language-Tobo #language-Tagbanwa #language-Kapin #language-Tabaru #language-Ditammari #language-Ticuna #language-Tanacross #language-Datooga #language-Tafi #language-Southern Tutchone #language-Malinaltepec Me'phaa #language-Tamagario #language-Wára #language-Tchitchege #language-Taman (Myanmar) #language-Tanahmerah #language-Tichurong #language-Kaiy #language-Torres Strait Creole #language-T'en #language-Toda #language-Tulu #language-Thado Chin #language-Tagdal #language-Emberá-Tadó #language-Tai Nüa #language-Tiranige Diga Dogon #language-Talieng #language-Western Tamang #language-Thulung #language-Tomadino #language-Tajio #language-Tambas #language-Sur #language-Taruma #language-Tondano #language-Teme #language-Todrah #language-Doutai #language-Tetun Dili #language-Toro #language-Tandroy-Mahafaly Malagasy #language-Tadyawan #language-Temiar #language-Terik #language-Tepo Krumen #language-Huehuetla Tepehua #language-Teke-Tege #language-Tehuelche #language-Torricelli #language-Ibali Teke #language-Telugu #language-Timne #language-Tama (Colombia) #language-Teso #language-Tepecano #language-Temein 
#language-Tereno #language-Tengger #language-Tetum #language-Soo #language-Teor #language-Tewa (USA) #language-Tennet #language-Tulishi #language-Tanaina #language-Teribe #language-Ternate #language-Tobilung #language-Tigak #language-Ciwogai #language-Tobagonian Creole English #language-Lawunuia #language-Tagin #language-Tajik #language-Tagalog #language-Tandaganon #language-Sudest #language-Tangoa #language-Tring #language-Tareng #language-Nume #language-Central Tagbanwa #language-Tanggu #language-Tagwana Senoufo #language-Tagish #language-Togoyo #language-Thai #language-Kuuk Thaayorre #language-Chitwania Tharu #language-Thangmi #language-Tharaka #language-Dangaura Tharu #language-Aheu #language-Thompson #language-Kochila Tharu #language-Rana Tharu #language-Thakali #language-Tahltan #language-Tahaggart Tamahaq #language-Tayart Tamajeq #language-Tidikelt Tamazight #language-Tira #language-Tifal #language-Tigre #language-Timugon Murut #language-Tiene #language-Tilung #language-Tikar #language-Tillamook #language-Timbe #language-Tindi #language-Teop #language-Trimuris #language-Tiéfo #language-Tigrinya #language-Masadiit Itneg #language-Tinigua #language-Adasen #language-Tiv #language-Tiwi #language-Southern Tiwa #language-Tiruray #language-Tai Hongjin #language-Tunjung #language-Northern Tujia #language-Timucua #language-Southern Tujia #language-Tjurruru #language-Djabwurrung #language-Tukudede #language-Takwane #language-Tesaka Malagasy #language-Tokelau #language-Takelma #language-Toku-No-Shima #language-Tikopia #language-Tee #language-Tsakhur #language-Takestani #language-Upper Necaxa Totonac #language-Teanu #language-Tobelo #language-Yecuatla Totonac #language-Talaud #language-Telefol #language-Tofanma #language-Klingon #language-Tlingit #language-Talinga-Bwisi #language-Taloki #language-Tetela #language-Tolomako #language-Talodi #language-Filomena Mata-Coahuitlán Totonac #language-Tai Loi #language-Talise #language-Tambotalo #language-Sou Nama #language-Tulehu #language-Taliabu #language-Khehek #language-Talysh #language-Tama (Chad) #language-Katbol #language-Tumak #language-Haruai #language-Toba-Maskoy #language-Tutuba #language-Samarokena #language-Tai Thanh #language-Taman (Indonesia) #language-Temoq #language-Tumleo #language-Jewish Babylonian Aramaic (ca. 
200-1200 CE) #language-Tima #language-Tasmate #language-Iau #language-Temuan #language-Tami #language-Tamanaku #language-Tacana #language-Tanimuca-Retuarã #language-Maiani #language-Tandia #language-Kwamera #language-Lenakel #language-Tabla #language-North Tanna #language-Toromono #language-Whitesands #language-Taino #language-Ménik #language-Tontemboan #language-Tonsawang #language-Tanema #language-Ten'edn #language-Toba #language-Coyutla Totonac #language-Gizrra #language-Tonga (Nyasa) #language-Gitonga #language-Tonga (Zambia) #language-Tojolabal #language-Tolowa #language-Tombulu #language-Tonga (Tonga Islands) #language-Xicotepec De Juárez Totonac #language-Papantla Totonac #language-Highland Totonac #language-Jemez #language-Tobian #language-Taupota #language-Azoyú Me'phaa #language-Tippera #language-Tarpia #language-Kula #language-Tok Pisin #language-Tapieté #language-Tlacoapa Me'phaa #language-Tampulma #language-Tupinambá #language-Pisaflores Tepehua #language-Tuparí #language-Tlachichilco Tepehua #language-Tampuan #language-tpw #language-Acatepec Me'phaa #language-Trumai #language-Tinputz #language-Tembé #language-Lehali #language-Toaripi #language-Tomoip #language-Tunni #language-Touo #language-Tonkawa #language-Tirahi #language-Copala Triqui #language-Turi #language-Trinidadian Creole English #language-Lishán Didán #language-Turaka #language-Trió #language-Tregami #language-Trinitario #language-Tarao Naga #language-Kok Borok #language-San Martín Itunyoso Triqui #language-Taushiro #language-Chicahuaxtla Triqui #language-Tunggare #language-Turoyo #language-Sediq #language-Torwali #language-Tringgus-Sembaan Bidayuh #language-Torá #language-Tsaangi #language-Tsamai #language-Tswa #language-Tsakonian #language-Tausug #language-Tsuvan #language-Tsimshian #language-Tshangla #language-Tseku #language-Ts'ün-Lao #language-Tswana #language-Tsonga #language-Akei #language-Tsou #language-Tsogo #language-Tsishingini #language-Mubami #language-Purepecha #language-Tutelo #language-Tektiteko #language-Tauade #language-Bwanabwana #language-Tuotomb #language-Tutong #language-Tobati #language-Tooro #language-Totoro #language-Northern Tutchone #language-Towei #language-Tawallammat Tamajaq #language-Tera #language-Northeastern Thai #language-Muslim Tat #language-Torau #language-Titan #language-Long Wat #language-Sikaritai #language-Wiarumus #language-Tübatulabal #language-Mutu #language-Tuxá #language-Tuyuca #language-Central Tunebo #language-Tunia #language-Taulil #language-Tupuri #language-Turkmen #language-Tula #language-Tumbuka #language-Tunica #language-Tucano #language-Tedaga #language-Turkish #language-Tuscarora #language-Tututni #language-Turkana #language-Tuxináwa #language-Tugen #language-Turka #language-Vaghua #language-Tsuvadi #language-Te'un #language-Southeast Ambrym #language-Tuvalu #language-Tela-Masbuar #language-Tidore #language-Taveta #language-Tunen #language-Sedoa #language-Timor Pidgin #language-Twana #language-Western Tawbuid #language-Twents #language-Tewa (Indonesia) #language-Northern Tiwa #language-Tereweng #language-Tai Dón #language-Twi #language-Tawang Monpa #language-Ere #language-Tasawaq #language-Turiwára #language-Termanu #language-Tuwari #language-Tewe #language-Tawoyan #language-Tombonuo #language-Tokharian B #language-Tsetsaut #language-Totoli #language-Tangut #language-Thracian #language-Ikpeng #language-Tomini #language-West Tarangan #language-Toto #language-Tonsea #language-Citak #language-Kayapó #language-Tatana #language-Tanosy Malagasy #language-Tauya 
#language-Kyanga #language-Teke-Tsaayi #language-Tai Do #language-Kombai #language-Thaypan #language-Tai Daeng #language-Kua #language-Tuvinian #language-Teke-Tyee #language-Tày #language-Tzeltal #language-Tz'utujil #language-Talossan #language-Central Atlas Tamazight #language-Tugun #language-Tzotzil #language-Tabriak #language-Uamué #language-Kuan #language-Tairuma #language-Ubang #language-Ubi #language-Buhi'non Bikol #language-Ubir #language-Umbu-Ungu #language-Ubykh #language-Uda #language-Udihe #language-Udi #language-Ujir #language-Wuzlam #language-Udmurt #language-Uduk #language-Kioko #language-Ufim #language-Ugaritic #language-Ughele #language-Ugong #language-Uhami #language-Damal #language-Uighur #language-Iyive #language-Tanjijili #language-Kaburi #language-Ukuriguma #language-Ukhwejo #language-Kui (India) #language-Ukpe-Bayobiri #language-Ukwa #language-Ukrainian #language-Ukue #language-Kuku #language-Fungwa #language-Ulch #language-Lule #language-Usku #language-Ulithian #language-Meriam Mir #language-Ulumanda' #language-Unserdeutsch #language-Uma' Lung #language-Ulwa #language-Umatilla #language-Umbundu #language-Marrucinian #language-Morrobalama #language-Umon #language-Umotína #language-Umpila #language-Umbugarla #language-Pendau #language-Munsee #language-North Watut #language-Undetermined #language-Uneme #language-Ngarinyin #language-Enawené-Nawé #language-Unami #language-Kurnai #language-Mundari #language-Unde Kaili #language-Uripiv-Wala-Rano-Atchin #language-Urarina #language-Urubú-Kaapor #language-Urningangg #language-Urdu #language-Uru #language-Uradhi #language-Urigina #language-Urhobo #language-Urim #language-Urak Lawoi' #language-Uruangnirin #language-Uru-Pa-In #language-Lehalurup #language-Urat #language-Urumi #language-Uruava #language-Sop #language-Urimo #language-Orya #language-Uru-Eu-Wau-Wau #language-Usarufa #language-Ushojo #language-Usaghade #language-Uspanteco #language-Uya #language-Otank #language-Ute-Southern Paiute #language-Amba (Solomon Islands) #language-Etulo #language-Utu #language-Urum #language-Ura (Vanuatu) #language-U #language-West Uvean #language-Uri #language-Lote #language-Kuku-Uwanh #language-Doko-Uyanga #language-Northern Uzbek #language-Southern Uzbek #language-Vaagri Booli #language-Vale #language-Vafsi #language-Vagla #language-Varhadi-Nagpuri #language-Vai #language-Sekele #language-Vehes #language-Vanimo #language-Valman #language-Vao #language-Huarijio #language-Vasavi #language-Vanuma #language-Wayu #language-Southeast Babar #language-Southwestern Bontok #language-Venetian #language-Veddah #language-Veluws #language-Vemgo-Mabas #language-Venda #language-Ventureño #language-Veps #language-Mom Jango #language-Virgin Islands Creole English #language-Vietnamese #language-Vili #language-Viemo #language-Vilela #language-Vinza #language-Viti #language-Iduna #language-Kariyarra #language-Kujarge #language-Kulisusu #language-Kamakan #language-Kodeoha #language-Korlai Creole Portuguese #language-Kurrama #language-Valpei #language-Vlaams #language-Martuyhunira #language-Barbaram #language-East Masela #language-Mainfränkisch #language-Lungalunga #language-Miwa #language-Makhuwa-Shirima #language-Malgana #language-Mitlatongo Mixtec #language-Soyaltepec Mazatec #language-Marenje #language-Makhuwa #language-Ayautla Mazatec #language-Mazatlán Mazatec #language-Vano #language-Vinmavis #language-Vunapu #language-Volapük #language-Voro #language-Votic #language-Vera'a #language-Võro #language-Varisi #language-Burmbar #language-Vunjo #language-Vute 
#language-Walla Walla #language-Wab #language-Wasco-Wishram #language-Wamesa #language-Walser #language-Watubela #language-Waffa #language-Wolaytta #language-Wampanoag #language-Wan #language-Wappo #language-Wapishana #language-Wagiman #language-Waray (Philippines) #language-Washo #language-Kaninuwa #language-Waurá #language-Waka #language-Waiwai #language-Watam #language-Wayana #language-Wampur #language-Warao #language-Wabo #language-Waritai #language-Wara #language-Wanda #language-Vwanji #language-Alagwa #language-Waigali #language-Wakhi #language-Wa #language-Warlpiri #language-Wagdi #language-Warnman #language-Wajarri #language-Woi #language-Yanomámi #language-Waci Gbe #language-Wandji #language-Wadaginam #language-Wadjiginy #language-Wewaw #language-Wedau #language-Wergaia #language-Weh #language-Kiunum #language-Wemale #language-Westphalien #language-Weri #language-Cameroon Pidgin #language-Perai #language-Wejewa #language-Yafi #language-Wagaya #language-Wagawaga #language-Wangkangurru #language-Wahgi #language-Waigeo #language-Wirangu #language-Warrgamay #language-Sou Upaa #language-Wahau Kenyah #language-Southern Toussian #language-Wichita #language-Wik Ngathan #language-Wik-Me'anha #language-Minidien #language-Wikalkan #language-Wilawila #language-Wik-Mungkan #language-Ho-Chunk #language-Wiraféd #language-Wiru #language-Vitu #language-Wiyot #language-Waja #language-Warji #language-Kw'adza #language-Wakawaka #language-Walio #language-Mwali Comorian #language-Wolane #language-Kunbarlang #language-Welaun #language-Waioli #language-Wailaki #language-Wali (Sudan) #language-Middle Welsh #language-Walloon #language-Wolio #language-Wailapa #language-Wallisian #language-Wuliwuli #language-Wichí Lhamtés Vejoz #language-Walak #language-Waling #language-Wambaya #language-Wamas #language-Mamaindé #language-Wambule #language-Waima'a #language-Wamin #language-Wom (Papua New Guinea) #language-Wambon #language-Walmajarri #language-Mwani #language-Womo #language-Mokati #language-Wantoat #language-Wandarang #language-Waneci #language-Ndzwani Comorian #language-Wanukaka #language-Wano #language-Wanap #language-Usan #language-Wintu #language-Wanyi #language-Kuwema #language-Wè Northern #language-Wogeo #language-Wolani #language-Woleaian #language-Gambian Wolof #language-Wogamusin #language-Kamang #language-Longto #language-Wolof #language-Wom (Nigeria) #language-Wongo #language-Hanga Hundi #language-Wawonii #language-Maco #language-Waluwarra #language-Warungu #language-Wiradjuri #language-Wariyangga #language-Garrwa #language-Warlmanpa #language-Warumungu #language-Warnang #language-Worrorra #language-Waropen #language-Wardaman #language-Waris #language-Waru #language-Waruna #language-Gugu Warra #language-Merwari #language-Waray (Australia) #language-Warembori #language-Adilabad Gondi #language-Wusi #language-Waskia #language-Owenia #language-Wotapuri-Katarqalai #language-Watiwa #language-Wathawurrung #language-Berta #language-Mewati #language-Wotu #language-Wunambal #language-Wutunhua #language-Silimo #language-Wumbvu #language-Bungu #language-Wutung #language-Wu Chinese #language-Wuvulu-Aua #language-Wulna #language-Wauyai #language-Waama #language-Wetamut #language-Warrwa #language-Wawa #language-Waxianghua #language-Wangaaybuwan-Ngiyambaa #language-Woiwurrung #language-Wymysorys #language-Wayoró #language-Western Fijian #language-Andalusian Arabic #language-Sambe #language-Kachari #language-Adai #language-Kaimbé #language-Ararandewára #language-Kalmyk #language-ǀXam #language-Xamtanga 
#language-Apalachee #language-Aquitanian #language-Karami #language-Kamas #language-Katawixi #language-Kauwera #language-Xavánte #language-Kawaiisu #language-Bactrian #language-Bigambal #language-Bunganditj #language-Kombio #language-Middle Breton #language-Kenaboi #language-Kambera #language-Kambiwá #language-Celtiberian #language-Cisalpine Gaulish #language-Chemakum #language-Classical Armenian #language-Comecrudo #language-Cotoname #language-Chorasmian #language-Carian #language-Classical Tibetan #language-Coahuilteco #language-Cayuse #language-Dacian #language-Dharuk #language-Edomite #language-Malayic Dayak #language-Eblan #language-Hdi #language-ǁXegwi #language-Kelo #language-Kembayan #language-Epi-Olmec #language-Xerénte #language-Kesawai #language-Xetá #language-Keoru-Ahia #language-Faliscan #language-Galatian #language-Gudang #language-Gabrielino-Fernandeño #language-Dharumbal #language-Hadrami #language-Xhosa #language-Hattic #language-Hurrian #language-Iberian #language-Xiri #language-Illyrian #language-Xinca #language-Xiriâna #language-Xipaya #language-Jaitmatang #language-Kalkoti #language-Northern Nago #language-Kereho #language-Kagoro #language-Mainstream Kenyah #language-Kayan River Kayan #language-Koroni #language-Xakriabá #language-Kumbewaha #language-Kgalagadi #language-Uma' Lasan #language-Kamula #language-Loup B #language-Lycian #language-Lydian #language-Ligurian (Ancient) #language-Loup A #language-Lepontic #language-Lusitanian #language-Cuneiform Luwian #language-Elymian #language-Mbonga #language-Mingrelian #language-Mengaka #language-Kugu-Muminh #language-Ancient Macedonian #language-Manado Malay #language-Manichaean Middle Persian #language-Meroitic #language-Matbat #language-Kamu #language-Antankarana Malagasy #language-Tsimihety Malagasy #language-Mayaguduna #language-Mori Bawah #language-Kanakanabu #language-Middle Mongolian #language-Ngarigu #language-Ngumbarl #language-Northern Kankanay #language-Anglo-Norman #language-Kanashi #language-Nyiyaparli #language-Kenzi #language-Kokoda #language-Soga #language-Xokleng #language-Komo (Sudan) #language-Konkomba #language-Xukurú #language-Kopar #language-Korubo #language-Kowaki #language-Northeastern Tasmanian #language-Oyster Bay Tasmanian #language-Liberia Kpelle #language-Southeast Tasmanian #language-Phrygian #language-Kulina Pano #language-Pumpokol #language-Mohegan-Pequot #language-Parthian #language-Punthamara #language-Punic #language-Northern Tasmanian #language-Puyo #language-Qatabanian #language-Krahô #language-Kreye #language-Krikati-Timbira #language-Arin #language-Aranama-Tamique #language-Marriammu #language-Karawa #language-Sabaean #language-Sambal #language-Scythian #language-Sempan #language-Shamang #language-Sio #language-South Slavey #language-Kasem #language-Silopi #language-Sherpa #language-xss #language-Sanumá #language-Sudovian #language-Saisiyat #language-Alcozauca Mixtec #language-Katcha-Kadugli-Miri #language-Diuxi-Tilantongo Mixtec #language-Ketengban #language-Transalpine Gaulish #language-Magdalena Peñasco Mixtec #language-Tokharian A #language-Thawa #language-Alu Kurumba #language-Betta Kurumba #language-Kunigami #language-Ngunawal #language-Umbrian #language-Kuo #language-Upper Umpqua #language-Urartian #language-Kuthant #language-Kxoe #language-Kamviri #language-Vandalic #language-Vestinian #language-Kwaza #language-Woccon #language-Wadi Wadi #language-Kwegu #language-Wangkumara #language-Written Oirat #language-Wemba Wemba #language-Ke'o #language-Tambora #language-Mayi-Yapi 
#language-Mayi-Kulan #language-Mayi-Thakurti #language-Yorta Yorta #language-Zhang-Zhung #language-Ancient Zapotec #language-Yaminahua #language-Yuhup #language-Yagua #language-Pumé #language-Yaka (Democratic Republic of Congo) #language-Yámana #language-Yazgulyam #language-Yagnobi #language-Banda-Yangere #language-Yakama #language-Yalunka #language-Yamba #language-Mayangna #language-Yao #language-Yapese #language-Yaqui #language-Yabarana #language-Nugunu (Cameroon) #language-Yambeta #language-Yuwana #language-Yangben #language-Yawalapití #language-Agwagwune #language-Lokaa #language-Yala #language-Yemba #language-West Yugur #language-Yakha #language-Yamphu #language-Hasha #language-Yukuben #language-Yaben #language-Yabaâna #language-Yabong #language-Yawiyo #language-Yaweyuha #language-Lolopo #language-Yucuna #language-Yanda #language-Eastern Yiddish #language-Yidgha #language-Yoidik #language-Ravula #language-Yeniche #language-Yimas #language-Tarok #language-Nyankpa #language-Yetfa #language-Yapunda #language-Yeyi #language-Malyangapa #language-Yagaria #language-Yagwoia #language-Baha Buyang #language-Hlepho Phowa #language-Yinggarda #language-Wusa Nasu #language-Western Yiddish #language-Yidiny #language-Yindjibarndi #language-Yimchungru Naga #language-Yis #language-Axi Yi #language-Yakan #language-Northern Yukaghir #language-Yoke #language-Kap #language-Yasa #language-Yekora #language-Kathu #language-Yakoma #language-Yele #language-Yelogu #language-Angguruk Yali #language-Yil #language-Langnian Buyang #language-Yalarnnga #language-Aribwaung #language-Nyâlayu #language-Yambes #language-Southern Muji #language-Yameo #language-Iamalele #language-Maay #language-Yangum Mon #language-Yamap #language-Yandruwandha #language-Naukan Yupik #language-Yangulam #language-Yana #language-Yendang #language-Yansi #language-Yahuna #language-Yoba #language-Yogad #language-Yonaguni #language-Yokuts #language-Yola #language-Yombe #language-Yongkom #language-Yoruba #language-Yotti #language-Yoron #language-Yoy #language-Phola #language-Phupha #language-Phuza #language-Yerakai #language-Yareba #language-Yaouré #language-Nenets #language-Nhengatu #language-Yerong #language-Yarawata #language-Sani #language-Sirenik Yupik #language-Yessan-Mayo #language-Yucateco #language-Yugambal #language-Yuchi #language-Yue Chinese #language-Havasupai-Walapai-Yavapai #language-Yug #language-Yurutí #language-Karkar-Yuri #language-Yuki #language-Yulu #language-Quechan #language-Bena (Nigeria) #language-Yukpa #language-Yurok #language-Yopno #language-Yau (Morobe Province) #language-Southern Yukaghir #language-East Yugur #language-Yuracare #language-Yawa #language-Yavitero #language-Kalou #language-Yawanawa #language-Wuding-Luquan Yi #language-Yawuru #language-Xishanba Lalo #language-Yawarawarga #language-Yagara #language-Yardliyawarra #language-Yir Yoront #language-Yau (Sandaun Province) #language-E'ma Buyang #language-Sierra de Juárez Zapotec #language-Western Tlacolula Valley Zapotec #language-Ocotlán Zapotec #language-Cajonos Zapotec #language-Yareni Zapotec #language-Ayoquesco Zapotec #language-Zaghawa #language-Isthmus Zapotec #language-Zaramo #language-Zanaki #language-Zauzou #language-Ozolotepec Zapotec #language-Zapotec #language-Aloápam Zapotec #language-Rincón Zapotec #language-Yatzachi Zapotec #language-Mitla Zapotec #language-Zayse-Zergulla #language-Zari #language-Balaibalan #language-Central Berawan #language-East Berawan #language-Blissymbols #language-West Berawan #language-Coatecas Altas Zapotec #language-Central 
Hongshuihe Zhuang #language-Ngazidja Comorian #language-Zeeuws #language-Zenag #language-Zenaga #language-Kinga #language-Guibei Zhuang #language-Standard Moroccan Tamazight #language-Guibian Zhuang #language-Magori #language-Zhuang #language-Zhaba #language-Dai Zhuang #language-Zhire #language-Nong Zhuang #language-Chinese #language-Zhoa #language-Zia #language-Zimakani #language-Mesme #language-Zinza #language-Zigula #language-Zizilivakan #language-Kaimbulawa #language-zkb #language-Kadu #language-Koguryo #language-Karankawa #language-Kott #language-São Paulo Kaingáng #language-Zakhring #language-Kitan #language-Kaurna #language-Liujiang Zhuang #language-Malay (individual language) #language-Lianshan Zhuang #language-Manda (Australia) #language-Zimba #language-Margany #language-Maridan #language-Mangerr #language-Marti Ke #language-Maridjabin #language-Mandandanyi #language-Matngala #language-Marimanindji #language-Mbangwe #language-Molo #language-Mpuono #language-Maranunggu #language-Maringarr #language-Muruwari #language-Mbariman-Gudhinma #language-Mbo (Democratic Republic of Congo) #language-Bomitaba #language-Mariyedi #language-Mbandja #language-Zande (individual language) #language-Mang #language-Mangas #language-Copainalá Zoque #language-Chimalapa Zoque #language-Zou #language-Tabasco Zoque #language-Rayón Zoque #language-Francisco León Zoque #language-Choapan Zapotec #language-San Pedro Quiatoni Zapotec #language-Santa María Quiegolani Zapotec #language-Lachixío Zapotec #language-Mixtepec Zapotec #language-Santa Inés Yatzechi Zapotec #language-Amatlán Zapotec #language-Zoogocho Zapotec #language-Santiago Xanica Zapotec #language-Coatlán Zapotec #language-Yalálag Zapotec #language-Chichicapan Zapotec #language-Zaniza Zapotec #language-San Baltazar Loxicha Zapotec #language-Texmelucan Zapotec #language-Qiubei Zhuang #language-Záparo #language-Zarphatic #language-Mairasi #language-Sarasira #language-Standard Malay #language-Sukurum #language-Xanaguía Zapotec #language-Loxicha Zapotec #language-Quioquitani-Quierí Zapotec #language-Tilquiapan Zapotec #language-Güilá Zapotec #language-Zeem #language-Tokano #language-Zulu #language-Kumzari #language-Zuni #language-Zay #language-Yongbei Zhuang #language-Yang Zhuang #language-Youjiang Zhuang #language-Yongnan Zhuang #language-Zuojiang Zhuang #license-cc0-1.0 #lexicons #region-us
# PanLex
January 1, 2024 version of PanLex Language Vocabulary with 24,650,274 rows covering 6,152 languages.
# Columns
- 'vocab': contains the text entry.
- '639-3': contains the ISO 639-3 language tags so that users can filter on the language(s) of their choice.
- '639-3_english_name': the English language name associated with the ISO 639-3 code.
- 'var_code': contains a code to differentiate language variants. In practice, this is the 639-3 code plus a number: if the number is 000, the entry corresponds to the plain 639-3 language; if it is different, the entry corresponds to a variant (e.g. 'fra-006' corresponds to Québécois according to URL). This code is created by PanLex and therefore does not follow an international standard.
- 'english_name_var': the English language name associated with the 'var_code'.
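
As a rough illustration of how these columns might be used, the sketch below loads the vocabulary with the `datasets` library and filters on the '639-3' column. The repository id and split name are placeholders (assumptions, not the official identifiers), and the small helper that splits a 'var_code' into its base code and variant number is a hypothetical convenience, not part of PanLex itself; only the column names ('vocab', '639-3', 'var_code') come from the card above.

```python
from datasets import load_dataset

# Placeholder repository id and split name -- replace with the actual ones.
ds = load_dataset("panlex/panlex-vocab", split="train")

# Keep only Breton entries by filtering on the ISO 639-3 column ('bre' = Breton).
breton = ds.filter(lambda row: row["639-3"] == "bre")

def split_var_code(var_code: str) -> tuple[str, int]:
    """Hypothetical helper: split a var_code such as 'fra-006' into ('fra', 6)."""
    base, number = var_code.rsplit("-", 1)
    return base, int(number)

print(breton[0]["vocab"], split_var_code("fra-006"))
```

Filtering on the '639-3' column rather than on the English name avoids ambiguity when several variants share the same display name.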
# Source
Readers are invited to consult the URL page, which summarizes all the sources.
> Kamholz, David, Jonathan Pool, and Susan M. Colowick. 2014.
> PanLex: Building a Resource for Panlingual Lexical Translation.
> Proceedings of the 9th International Conference on Language Resources and Evaluation (LREC 2014).
> URL
#language-Inupiaq #language-Ipiko #language-Iquito #language-Iresim #language-Irarutu #language-Rigwe #language-Iraqw #language-Irántxe #language-Irula #language-Kamberau #language-Iraya #language-Isabi #language-Isconahua #language-Isnag #language-Nkem-Nkum #language-Ishkashimi #language-Icelandic #language-Isoko #language-Istriot #language-Isu (Menchum Division) #language-Italian #language-Binongan Itneg #language-Southern Tidung #language-Itene #language-Inlaod Itneg #language-Judeo-Italian #language-Itelmen #language-Itu Mbon Uzo #language-Itonama #language-Iteri #language-Itawit #language-Ito #language-Itik #language-Itzá #language-Iu Mien #language-Ibatan #language-Ivatan #language-Iwam #language-Sepik Iwam #language-Ixcatec #language-Ixil #language-Mesaka #language-Yaka (Congo) #language-Ingrian #language-Izere #language-Izii #language-Jamamadí #language-Hyam #language-Popti' #language-Yabem #language-Jara #language-Jah Hut #language-Zazao #language-Jakun #language-Yalahatan #language-Jamaican Creole English #language-Yanyuwa #language-Yaqay #language-Yaur #language-Javanese #language-Jambi Malay #language-Yan-nhangu #language-Jawe #language-Judeo-Berber #language-Badjiri #language-Arandai #language-Barikewa #language-Nafusi #language-Lojban #language-Jabutí #language-Jukun Takum #language-Krymchak #language-Judeo-Tat #language-Jebero #language-Jerung #language-Jeh #language-Yei #language-Jeri Kuo #language-Yelmek #language-Dza #language-Jere #language-Manem #language-Jonkor Bourmataguil #language-Judeo-Georgian #language-Gwak #language-Ngomba #language-Jehai #language-Jina #language-Jibu #language-Tol #language-Bu (Kaduna State) #language-Jingulu #language-Jiiddu #language-Jilim #language-Jimi (Cameroon) #language-Jiamao #language-Jita #language-Youle Jinuo #language-Shuar #language-Bankal #language-Kaera #language-Mobwa Karen #language-Kubo #language-Paku Karen #language-Labir #language-Ngile #language-Dima #language-Zumbun #language-Machame #language-Yamdena #language-Jimi (Nigeria) #language-Jumli #language-Kamara #language-Mashi (Nigeria) #language-Western Juxtlahuaca Mixtec #language-Jangshung #language-Yangman #language-Janji #language-Yemsa #language-Jaunsari #language-Joba #language-Wojenaka #language-Jorá #language-Jowulu #language-Jewish Palestinian Aramaic #language-Japanese #language-Jaqaru #language-Jarai #language-Judeo-Arabic #language-Jiru #language-Japrería #language-Júma #language-Wannu #language-Jurchen #language-Worodougou #language-Hõne #language-Wapan #language-Jirel #language-Jumjum #language-Juang #language-Jiba #language-Hupdë #language-Jurúna #language-Jutish #language-Ju #language-Wãpha #language-Juray #language-Caribbean Javanese #language-Jiarong #language-Kara-Kalpak #language-Kabyle #language-Kachin #language-Adara #language-Ketangalan #language-Kajaman #language-Kara (Central African Republic) #language-Karekare #language-Jju #language-Kalanguya #language-Kalaallisut #language-Kamba (Kenya) #language-Kannada #language-Xaasongaxango #language-Bezhta #language-Capanahua #language-Kashmiri #language-Georgian #language-Kanuri #language-Katukína #language-Kawi #language-Kamayurá #language-Kazakh #language-Kaxuiâna #language-Kadiwéu #language-Kabardian #language-Kanju #language-Camsá #language-Kaptiau #language-Kari #language-Grass Koiari #language-Kanembu #language-Iwal #language-Kare (Central African Republic) #language-Keliko #language-Kabiyè #language-Kamano #language-Kafa #language-Kande #language-Abadi #language-Dera (Indonesia) #language-Kaiep 
#language-Ap Ma #language-Manga Kanuri #language-Duhwa #language-Khanty #language-Kawacha #language-Ngkâlmpw Kanum #language-Ukaan #language-Tyap #language-Kamantan #language-Kobiana #language-Kalanga #language-Kela (Papua New Guinea) #language-Gula (Central African Republic) #language-Nubi #language-Kinalakna #language-Kanga #language-Katla #language-Kaian #language-Kami (Tanzania) #language-Kete #language-Kabwari #language-Kachama-Ganjule #language-Korandje #language-Worimi #language-Kutu #language-Yankunytjatjara #language-Makonde #language-Mamusi #language-Tem #language-Kumam #language-Karamojong #language-Numèè #language-Tsikimba #language-Kagoma #language-Kunda #language-Kaningdon-Nindem #language-Koch #language-Karaim #language-Kuy #language-Kadaru #language-Koneraw #language-Kam #language-Kwaja #language-Kabuverdianu #language-Kélé #language-Keiga #language-Kerewe #language-Eastern Keres #language-Kpessi #language-Tese #language-Kei #language-Kekchí #language-Kela (Democratic Republic of Congo) #language-Kemak #language-Kenyang #language-Kakwa #language-Kera #language-Kugbo #language-Ket #language-Akebu #language-West Kewa #language-Kukna #language-Kukele #language-Kodava #language-Northwestern Kolami #language-Konda-Dora #language-Korra Koraga #language-Kota (India) #language-Koya #language-Kannada Kurumba #language-Kinnauri #language-Khunsari #language-Kuk #language-Koro (Côte d'Ivoire) #language-Korwa #language-Korku #language-Kachhi #language-Kumaoni #language-Koromfé #language-Koyaga #language-Kawe #language-Komering #language-Kube #language-Kusunda #language-Gamale Kham #language-Kaiwá #language-Kunggari #language-Krongo #language-Kaingang #language-Kamoro #language-Abun #language-Kumbainggar #language-Somyev #language-Kobol #language-Karas #language-Karon Dori #language-Kamaru #language-Kyerung #language-Khasi #language-Lü #language-Tukang Besi North #language-Korowai #language-Khuen #language-Khams Tibetan #language-Kehu #language-Kuturmi #language-Halh Mongolian #language-Lusi #language-Khmer #language-Khandesi #language-Khotanese #language-Kapori #language-Koyra Chiini Songhay #language-Kharia #language-Kasua #language-Khamti #language-Nkhumbi #language-Khvarshi #language-Khowar #language-Kele (Democratic Republic of Congo) #language-Keapara #language-Kim #language-Koalib #language-Kickapoo #language-Koshin #language-Kibet #language-Kimaama #language-Kilmeri #language-Kitsai #language-Kilivila #language-Kikuyu #language-Kariya #language-Karagas #language-Kinyarwanda #language-Kiowa #language-Kosadle #language-Kirghiz #language-Kis #language-Agob #language-Kirmanjki (individual language) #language-Kimbu #language-Northeast Kiwai #language-Kirikiri #language-Kisi #language-Mlap #language-Q'anjob'al #language-Coastal Konjo #language-Southern Kiwai #language-Kisar #language-Khmu #language-Khakas #language-Zabana #language-Khinalugh #language-Highland Konjo #language-Western Parbate Kham #language-Kháng #language-Kunjen #language-Pwo Eastern Karen #language-Western Keres #language-Kurudu #language-East Kewa #language-Kashaya #language-Ramopa #language-Erave #language-Bumthangkha #language-Kakanda #language-Kwerisa #language-Odoodee #language-Kakabe #language-Khün #language-Kagulu #language-Kako #language-Kokota #language-Kosarek Yale #language-Kiong #language-Karko #language-Gugubera #language-Giiwo #language-Teke-Kukuya #language-Guugu Yimidhirr #language-Kaska #language-Klamath-Modoc #language-Kiliwa #language-Kolbila #language-Gamilaraay #language-Kulung (Nepal) 
#language-Tagakaulo #language-Weliki #language-Kalumpang #language-Khalaj #language-Kagan Kalagan #language-Migum #language-Kalenjin #language-Kamasa #language-Rumu #language-Khaling #language-Kalasha #language-Nukna #language-Klao #language-Maskelynes #language-Tado #language-Koluwawa #language-Kalao #language-Kabola #language-Konni #language-Kimbundu #language-Southern Dong #language-Majukayang Kalinga #language-Kare (Papua New Guinea) #language-Kâte #language-Kalam #language-Kami (Nigeria) #language-Limos Kalinga #language-Tanudan Kalinga #language-Kom (India) #language-Awtuw #language-Kwoma #language-Kwama #language-Northern Kurdish #language-Kamasau #language-Kemtuik #language-Kanite #language-Karipúna Creole French #language-Komo (Democratic Republic of Congo) #language-Waboda #language-Khorasani Turkish #language-Dera (Nigeria) #language-Lubuagan Kalinga #language-Central Kanuri #language-Konda #language-Kankanaey #language-Mankanya #language-Koongo #language-Kanufi #language-Western Kanjobal #language-Kuranko #language-Kanamarí #language-Konkani (individual language) #language-Kono (Sierra Leone) #language-Kwanja #language-Kintaq #language-Kaningra #language-Kensiu #language-Panoan Katukína #language-Tabo #language-Kung-Ekoka #language-Kendayan #language-Kanyok #language-Kodi #language-Kacipo-Bale Suri #language-Kubi #language-Cogui #language-Koyo #language-Komi-Permyak #language-Konkani (macrolanguage) #language-Kol (Papua New Guinea) #language-Kongo #language-Konzo #language-Waube #language-Kota (Gabon) #language-Korean #language-Kosraean #language-Lagwan #language-Koke #language-Kudu-Camo #language-Koyukon #language-Korak #language-Curripaco #language-Kpelle #language-Komba #language-Kapingamarangi #language-Kplang #language-Karajá #language-Kpan #language-Kpala #language-Koho #language-Kepkiriwát #language-Ikposo #language-Korupun-Sela #language-Korafe-Yegha #language-Tehit #language-Karata #language-Kafoa #language-Komi-Zyrian #language-Kobon #language-Mountain Koiali #language-Koryak #language-Kupsabiny #language-Mum #language-Kovai #language-Doromu-Koki #language-Kalagan #language-Kakabai #language-Koitabu #language-Koromira #language-Kyenele #language-Kaonde #language-Eastern Krahn #language-Krenak #language-Kimaragang #language-Northern Kissi #language-Klias River Kadazan #language-Seroa #language-Okolod #language-Kandas #language-Mser #language-Koorete #language-Korana #language-Karkin #language-Karachay-Balkar #language-Kairui-Midiki #language-Panará #language-Koro (Vanuatu) #language-Kurama #language-Krio #language-Kinaray-A #language-Kerek #language-Karelian #language-Durop #language-Krung #language-Gbaya (Sudan) #language-Kurukh #language-Karon #language-Kryts #language-Sota Kanum #language-Shambala #language-Southern Kalinga #language-Kuanua #language-Kuni #language-Bafia #language-Kusaghe #language-Kölsch #language-Krisa #language-Uare #language-Kansa #language-Kumalu #language-Kumba #language-Kasiguranin #language-Kaba #language-Kwaami #language-Borong #language-Southern Kisi #language-Winyé #language-Kusu #language-S'gaw Karen #language-Kedang #language-Kodaku #language-Kambaata #language-Kokata #language-Nubri #language-Kalkutung #language-Karanga #language-North Muyu #language-Plapo Krumen #language-Kaniet #language-Kurti #language-Karitiâna #language-Kuot #language-Kaduo #language-South Muyu #language-Ketum #language-Kituba (Democratic Republic of Congo) #language-Eastern Katu #language-Kato #language-Kaxararí #language-Juǀʼhoan #language-Kuanyama 
#language-Kutep #language-'Auhelawa #language-Kuman (Papua New Guinea) #language-Kupa #language-Kuikúro-Kalapálo #language-Kuria #language-Kulere #language-Kumyk #language-Kunama #language-Kumukio #language-Kunimaipa #language-Karipuna #language-Kusaal #language-Kutenai #language-Upper Kuskokwim #language-Kukatja #language-Kuuku-Ya'u #language-Kunza #language-Bagvalal #language-Kubu #language-Kove #language-Kui (Indonesia) #language-Kalabakan #language-Kabalai #language-Kuni-Boazi #language-Komodo #language-Psikye #language-Kendem #language-Border Kuna #language-Dobel #language-Geba Karen #language-Kerinci #language-Yinbaw Karen #language-Kola #language-Wersing #language-Yintale Karen #language-Dâw #language-Kwa #language-Kwaio #language-Kwerba #language-Kwara'ae #language-Sara Kaba Deme #language-Kowiai #language-Awa-Cuaiquer #language-Kwanga #language-Kwakiutl #language-Kofyar #language-Kwangali #language-Kwomtari #language-Kodia #language-Kwese #language-Sara Kaba Náà #language-San Salvador Kongo #language-Kwadi #language-Kairiru #language-Krobu #language-Konso #language-Brunei #language-Manumanaw Karen #language-Karo (Ethiopia) #language-Keningau Murut #language-Kulfa #language-Northern Khmer #language-Kanowit-Tanjong Melanau #language-Kanoé #language-Kuvi #language-Konai #language-Kerewo #language-Kwaya #language-Kyaka #language-Krache #language-Kouya #language-Karok #language-Kiput #language-Karao #language-Kamayo #language-Kalapuya #language-Kpatili #language-Kelon #language-Kenga #language-Kuruáya #language-Baram Kayan #language-Kayagar #language-Western Kayah #language-Rapoisi #language-Kayabí #language-Western Karaboro #language-Kaibobo #language-Bondoukou Kulango #language-Kadai #language-Kosena #language-Da'a Kaili #language-Kikai #language-Kelabit #language-Kazukuru #language-Kayeli #language-Kais #language-Kokola #language-Kaningi #language-Kaidipang #language-Kaike #language-Karang #language-Sugut Dusun #language-Kayupulau #language-Karirí-Xocó #language-Kamarian #language-Kalabra #language-Southern Subanen #language-Lacandon #language-Ladino #language-Pattani #language-Lafofa #language-Rangi #language-Lahnda #language-Lambya #language-Lango (Uganda) #language-Lamba #language-Laru #language-Lao #language-Laka (Chad) #language-Qabiao #language-Larteh #language-Lama (Togo) #language-Latin #language-Laba #language-Lauje #language-Tiwa #language-Aribwatsa #language-Label #language-Lakkia #language-Lak #language-Tinani #language-Ladakhi #language-Central Bontok #language-Libon Bikol #language-Rmeet #language-Laven #language-Wampar #language-Lohorung #language-Lachi #language-Labu #language-Lavatbura-Lamusong #language-Tolaki #language-Lawangan #language-Lamalama #language-Lardil #language-Legenyem #language-Luchazi #language-Tungag #language-Western Lawa #language-Luhu #language-Lisabata-Nuniali #language-Kla-Dan #language-Dũya #language-Laari #language-Lemoro #language-Leelau #language-Kaan #language-Landoma #language-Láadan #language-Lega-Shabunda #language-Lala-Bisa #language-Leco #language-Lendu #language-Lyélé #language-Lelemi #language-Lenje #language-Lemio #language-Lengola #language-Leipon #language-Nomaande #language-Lenca #language-Lepcha #language-Lembena #language-Lenkau #language-Lese #language-Lesing-Gelimi #language-Kara (Papua New Guinea) #language-Lamma #language-Ledo Kaili #language-Luang #language-Lezghian #language-Lefa #language-Lingua Franca Nova #language-Lungga #language-Laghu #language-Lugbara #language-Laghuu #language-Lengilu #language-Lingarak 
#language-Wala #language-T'apo #language-Logba #language-Lengo #language-Pahi #language-Longgu #language-Laha (Viet Nam) #language-Laha (Indonesia) #language-Lahu Shi #language-Lhomi #language-Lahanan #language-Mlahsö #language-Lo-Toga #language-Lahu #language-West-Central Limba #language-Likum #language-Hlai #language-Nyindrou #language-Limbu #language-Ligbi #language-Lihir #language-Ligurian #language-Lillooet #language-Limburgan #language-Lingala #language-Liki #language-Sekpele #language-Libido #language-Liberian English #language-Lisu #language-Lithuanian #language-Logorik #language-Liv #language-Liabuku #language-Banda-Bambari #language-Libinza #language-Golpa #language-Rampi #language-Laiyolo #language-Li'o #language-Lampung Api #language-Lakalei #language-Kabras #language-Lakondê #language-Kenyi #language-Lakha #language-Laki #language-Laeko-Libuat #language-Lakon #language-Päri #language-Lakota #language-Kungkari #language-Lokoya #language-Lala-Roba #language-Lele (Guinea) #language-Ladin #language-Lele (Papua New Guinea) #language-Hermit #language-Lole #language-Teke-Laali #language-Lelak #language-Lilau #language-Lasalimu #language-Lele (Chad) #language-North Efate #language-Lau #language-East Limba #language-Merei #language-Limilngan #language-Pévé #language-Lamogai #language-Lambichhong #language-Lamkang #language-Hano #language-Lambadi #language-Lombard #language-Limbum #language-Lamalera #language-Lamenu #language-Lake Miwok #language-Laimbue #language-Lamboya #language-Langbashe #language-Mbalanhu #language-Lundayeh #language-Lanoh #language-Leningitij #language-South Central Banda #language-Lorediakarkar #language-Lamnso' #language-Longuda #language-Loloda #language-Lobi #language-Inonhan #language-Saluan #language-Logol #language-Logo #language-Laarim #language-Loma (Côte d'Ivoire) #language-Lou #language-Loko #language-Mongo #language-Loma (Liberia) #language-Malawi Lomwe #language-Lombo #language-Lopa #language-Téén #language-Loniu #language-Otuho #language-Louisiana Creole #language-Loke #language-Lozi #language-Lelepa #language-Lepki #language-Rara Bakati' #language-Northern Luri #language-Laurentian #language-Laragia #language-Lari #language-Laro #language-Larevat #language-Lemerig #language-Lasgerdi #language-Lishana Deni #language-Lusengo #language-Lish #language-Lashi #language-Saamia #language-Aruop #language-Late Middle Chinese #language-Latgalian #language-Leti (Indonesia) #language-Latundê #language-Luxembourgish #language-Luba-Lulua #language-Luba-Katanga #language-Aringa #language-Ludian #language-Luvale #language-Laua #language-Ganda #language-Luiseno #language-Luna #language-Olu'bo #language-Luimbi #language-Lunda #language-Luo (Kenya and Tanzania) #language-Lumbu #language-Lucumi #language-Laura #language-Lushai #language-Lushootseed #language-Luo (Cameroon) #language-Luyia #language-Maku'a #language-Lavukaleve #language-Standard Latvian #language-White Lachi #language-Eastern Lawa #language-Luwo #language-Lewo #language-Lyngngam #language-Luyana #language-Literary Chinese #language-Litzlitz #language-Laz #language-San Jerónimo Tecóatl Mazatec #language-Madurese #language-Bo-Rukul #language-Mafa #language-Magahi #language-Marshallese #language-Maithili #language-Jalapa De Díaz Mazatec #language-Makasar #language-Malayalam #language-Mam #language-Mandingo #language-Chiquihuitlán Mazatec #language-Marathi #language-Masai #language-San Francisco Matlatzinca #language-Huautla Mazatec #language-Sateré-Mawé #language-Mampruli #language-North Moluccan Malay 
#language-Central Mazahua #language-Western Bukidnon Manobo #language-Macushi #language-Dibabawon Manobo #language-Molale #language-Baba Malay #language-Mangseng #language-Ilianen Manobo #language-Nadëb #language-Maxakalí #language-Ombamba #language-Macaguán #language-Mbo (Cameroon) #language-Malayo #language-Maisin #language-Nukak Makú #language-Sarangani Manobo #language-Matigsalug Manobo #language-Mbula-Bwazza #language-Mbulungish #language-Maring #language-Mari (East Sepik Province) #language-Amoltepec Mixtec #language-Maca #language-Machiguenga #language-Bitur #language-Sharanahua #language-Matsés #language-Mapoyo #language-Maquiritari #language-Mese #language-Mvanip #language-Mbunda #language-Malaccan Creole Portuguese #language-Masana #language-Coatlán Mixe #language-Makaa #language-Ese #language-Menya #language-Mambai #language-Mengisa #language-Cameroon Mambila #language-Minanibai #language-Mawa (Chad) #language-Mpiemo #language-South Watut #language-Mawan #language-Mada (Nigeria) #language-Morigi #language-Male (Papua New Guinea) #language-Mbum #language-Maba (Chad) #language-Moksha #language-Maguindanaon #language-Mamvu #language-Mangbetu #language-Mayogo #language-Mbala #language-Mbole #language-Mandar #language-Maria (Papua New Guinea) #language-Mbere #language-Santa Lucía Monteverde Mixtec #language-Mbosi #language-Dizin #language-Male (Ethiopia) #language-Suruí Do Pará #language-Menka #language-Ikobi #language-Marra #language-Melpa #language-Mengen #language-Megam #language-Midob #language-Meyah #language-Mekeo #language-Central Melanau #language-Mangala #language-Mende (Sierra Leone) #language-Kedah Malay #language-Miriwoong #language-Merey #language-Meru #language-Mato #language-Motu #language-Mano #language-Maaka #language-Hassaniyya #language-Menominee #language-Pattani Malay #language-Bangka #language-Mba #language-Mendankwe-Nkwen #language-Morisyen #language-Naki #language-Mogofin #language-Matal #language-Wandala #language-Mefele #language-Putai #language-Marghi South #language-Cross River Mbembe #language-Mbe #language-Makassar Malay #language-Moba #language-Marrithiyel #language-Mokerang #language-Mbwela #language-Mandjak #language-Mulaha #language-Melo #language-Mayo #language-Mabaan #language-Middle Irish (900-1200) #language-Mararit #language-Morokodo #language-Moru #language-Mango #language-Maklew #language-Makhuwa-Meetto #language-Lijili #language-Mawes #language-Maleu-Kilenge #language-Mambae #language-Meta' #language-Eastern Magar #language-Malila #language-Mambwe-Lungu #language-Manda (Tanzania) #language-Mailu #language-Matengo #language-Matumbi #language-Mbunga #language-Mbugwe #language-Manda (India) #language-Mahongwe #language-Mocho #language-Mbugu #language-Besisi #language-Mamaa #language-Margu #language-Ma'di #language-Mogholi #language-Mungaka #language-Mauwake #language-Mócheno #language-Mashi (Zambia) #language-Mandan #language-Eastern Mari #language-Buru (Indonesia) #language-Mandahuaca #language-Digaro-Mishmi #language-Mbukushu #language-Maru #language-Ma'anyan #language-Mor (Mor Islands) #language-Miami #language-Atatláhuca Mixtec #language-Mi'kmaq #language-Mandaic #language-Ocotepec Mixtec #language-Mofu-Gudur #language-San Miguel El Grande Mixtec #language-Chayuco Mixtec #language-Abar #language-Mikasuki #language-Peñoles Mixtec #language-Alacatlatzala Mixtec #language-Minangkabau #language-Pinotepa Nacional Mixtec #language-Apasco-Apoala Mixtec #language-Mískito #language-Isthmus Mixe #language-Southern Puebla Mixtec #language-Cacaloxtepec 
Mixtec #language-Akoye #language-Mixtepec Mixtec #language-San Juan Colorado Mixtec #language-Northwest Maidu #language-Tu #language-Mwera (Nyasa) #language-Kim Mun #language-Mawak #language-Matukar #language-Medebur #language-Miship #language-Sauria Paharia #language-Karbi #language-Mahali #language-Mahican #language-Mbre #language-Siliput #language-Macedonian #language-Mawchi #language-Miya #language-Mak (China) #language-Dhatki #language-Mokilese #language-Mokole #language-Moklen #language-Kupang Malay #language-Moikodi #language-Bay Miwok #language-Malas #language-Silacayoapan Mixtec #language-Mafea #language-Kituba (Congo) #language-East Makian #language-Makasae #language-Malo #language-Cao Lan #language-Manambu #language-Mal #language-Malagasy #language-Mape #language-Miltu #language-Ilwana #language-Malua Bay #language-Mulam #language-Malango #language-Bargam #language-Western Maninkakan #language-Vame #language-Masalit #language-Maltese #language-To'abaita #language-Motlav #language-Moloko #language-Malfaxal #language-Mama #language-Momina #language-Maonan #language-Mae #language-Mundat #language-North Ambrym #language-Mehináku #language-Musar #language-Maii #language-Mamanwa #language-Mangga Buang #language-Siawi #language-Musak #language-Western Xiangxi Miao #language-Malalamai #language-Mmaala #language-Emae #language-Madak #language-Migaama #language-Mabaale #language-Mbula #language-Muna #language-Manchu #language-Mondé #language-Naba #language-Mundani #language-Eastern Mnong #language-Mono (Democratic Republic of Congo) #language-Manipuri #language-Munji #language-Mandinka #language-Tiale #language-Mapena #language-Min Bei Chinese #language-Minriq #language-Mono (USA) #language-Mansi #language-Rennell-Bellona #language-Mon #language-Manikion #language-Manyawa #language-Moni #language-Mwan #language-Mocoví #language-Mobilian #language-Innu #language-Mongondow #language-Mohawk #language-Mboi #language-Monzombo #language-Morori #language-Mangue #language-Mopán Maya #language-Mor (Bomberai Peninsula) #language-Moro #language-Mossi #language-Barí #language-Mogum #language-Mohave #language-Molima #language-Shekkacho #language-Mukulu #language-Mpoto #language-Malak Malak #language-Mangarrayi #language-Machinere #language-Majang #language-Marba #language-Maung #language-Mpade #language-Martu Wangka #language-Mbara (Chad) #language-Middle Watut #language-Yosondúa Mixtec #language-Mindiri #language-Migabac #language-Matís #language-Vangunu #language-Dadibi #language-Mian #language-Makuráp #language-Mungkip #language-Mapidian #language-Misima-Panaeati #language-Mapia #language-Mpi #language-Maba (Indonesia) #language-Mbuko #language-Mangole #language-Matepi #language-Momuna #language-Mamasa #language-Rajah Kabunsuwan Manobo #language-Mbelime #language-South Marquesan #language-Moronene #language-Modole #language-Manipa #language-Mander #language-West Makian #language-Mandari #language-Mosimo #language-Murupi #language-Mamuju #language-Manggarai #language-Pano #language-Mlabri #language-Marino #language-Maricopa #language-Elseng #language-Mising #language-Mara Chin #language-Maori #language-Western Mari #language-Hmwaveke #language-Mortlockese #language-Merlav #language-Cheke Holo #language-Mru #language-Morouas #language-North Marquesan #language-Maria (India) #language-Maragus #language-Marghi Central #language-Mono (Cameroon) #language-Mangareva #language-Maranao #language-Maremgi #language-Mandaya #language-Marind #language-Malay (macrolanguage) #language-Masbatenyo #language-Musey 
#language-Mekwei #language-Moraid #language-Masikoro Malagasy #language-Ma (Democratic Republic of Congo) #language-Mansaka #language-Molof #language-Agusan Manobo #language-Vurës #language-Mombum #language-West Masela #language-Musom #language-Mansoanka #language-Moresada #language-Aruamu #language-Momare #language-Cotabato Manobo #language-Munit #language-Mualang #language-Mono (Solomon Islands) #language-Murik (Papua New Guinea) #language-Una #language-Munggui #language-Maiwa (Papua New Guinea) #language-Moskona #language-Mbe' #language-Montol #language-Mator #language-Matagalpa #language-Totontepec Mixe #language-Wichí Lhamtés Nocten #language-Muong #language-Mewari #language-Yora #language-Mota #language-Tututepec Mixtec #language-Asaro'o #language-Nabi #language-Mundang #language-Mubi #language-Ajumbu #language-Media Lengua #language-Musgu #language-Mündü #language-Musi #language-Mugom #language-Multiple languages #language-Nyong #language-Malvi #language-Murle #language-Creek #language-Yaaku #language-Bo-Ung #language-Muyang #language-Mursi #language-Manam #language-Mattole #language-Mamboru #language-Marwari (Pakistan) #language-Peripheral Mongolian #language-Miyako #language-Minaveha #language-Marovo #language-Duri #language-Moere #language-Marau #language-Mpotovoro #language-Marfa #language-Tagal Murut #language-Meoswar #language-Indus Kohistani #language-Mesqan #language-Mwatebu #language-Juwal #language-Are #language-Mwera (Chimwera) #language-Murrinh-Patha #language-Aiklep #language-Mouk-Aria #language-Labo #language-Mirandese #language-Sar #language-Nyamwanga #language-Central Maewo #language-Kala Lagaw Ya #language-Mün Chin #language-Marwari #language-Moken #language-Mentawai #language-Hmong Daw #language-Tezoatlán Mixtec #language-Manyika #language-Modang #language-Mele-Fila #language-Malgbe #language-Mbangala #language-Mozarabic #language-Miju-Mishmi #language-Monumbo #language-Meramera #language-Moi (Indonesia) #language-Tlahuitoltepec Mixe #language-Juquila Mixe #language-Murik (Malaysia) #language-Jamiltepec Mixtec #language-Mada (Cameroon) #language-Metlatónoc Mixtec #language-Namo #language-Mahou #language-Southeastern Nochixtlán Mixtec #language-Central Masela #language-Burmese #language-Mbay #language-Myene #language-Bambassi #language-Manta #language-Makah #language-Mangayat #language-Mamara Senoufo #language-Moma #language-Me'en #language-Anfillo #language-Pirahã #language-Muniche #language-Mesmes #language-Mundurukú #language-Erzya #language-Muyuw #language-Masaaba #language-Macuna #language-Classical Mandaic #language-Tumzabt #language-Malimba #language-Morawa #language-Wichí Lhamtés Güisnay #language-Ixcatlán Mazatec #language-Manya #language-Nigeria Mambila #language-Mumuye #language-Mazanderani #language-Matipuhy #language-Movima #language-Mori Atas #language-Marúbo #language-Macanese #language-Mintil #language-Inapang #language-Manza #language-Deg #language-Maiadomu #language-Southern Nambikuára #language-Narak #language-Naka'ela #language-Nabak #language-Naga Pidgin #language-Nalu #language-Nakanai #language-Nalik #language-Ngan'gityemerri #language-Neapolitan #language-Khoekhoe #language-Iguta #language-Naasioi #language-Ca̱hungwa̱rya̱ #language-Nauru #language-Navajo #language-Nawuri #language-Nakwi #language-Ngarrindjeri #language-Coatepec Nahuatl #language-Nyemba #language-Ndoe #language-Chang Naga #language-Konyak Naga #language-Ngamo #language-Mao Naga #language-Ngarinyman #language-Nake #language-South Ndebele #language-Ngbaka Ma'bo #language-Kuri 
#language-Nkukoli #language-Nnam #language-Nggem #language-Numana #language-Rongmei Naga #language-Ngamambo #language-Southern Ngbandi #language-Ningera #language-Iyo #language-Central Nicobarese #language-Ponam #language-Nachering #language-Yale #language-Notsi #language-Nisga'a #language-Central Huasteca Nahuatl #language-Classical Nahuatl #language-Northern Puebla Nahuatl #language-Na-kara #language-Michoacán Nahuatl #language-Nambo #language-Nauna #language-Sibe #language-Ncane #language-Chothe Naga #language-Chumburung #language-Central Puebla Nahuatl #language-Natchez #language-Ndasa #language-Kenswei Nsei #language-Ndau #language-Nde-Nsele-Nta #language-North Ndebele #language-Ndengereko #language-Ndali #language-Samba Leko #language-Ndamba #language-Ndam #language-Ngundi #language-Ndonga #language-Ndombe #language-Ndoola #language-Low German #language-Ndunga #language-Dugun #language-Ndut #language-Nduga #language-Lutos #language-Ndogo #language-Toura (Côte d'Ivoire) #language-Nedebang #language-Nêlêmwa-Nixumwak #language-Negidal #language-Nyenkha #language-Neko #language-Nemi #language-Nengone #language-Ná-Meo #language-North Central Mixe #language-Yahadian #language-Nete #language-Neo #language-Nyaheun #language-Newari #language-Neyo #language-Nez Perce #language-Dhao #language-Ahwai #language-Ayiwo #language-Nafaanra #language-Mfumte #language-Ngbaka #language-Northern Ngbandi #language-Ngombe (Democratic Republic of Congo) #language-Ngemba #language-Ngbaka Manza #language-Nǁng #language-Ngizim #language-Ngie #language-Dalabon #language-Lomwe #language-Ngwo #language-Ngulu #language-Ngurimi #language-Engdewu #language-Gvoko #language-Kriang #language-Guerrero Nahuatl #language-Nagumi #language-Nggwahyi #language-Tibea #language-Ngungwel #language-Nhanda #language-Beng #language-Tabasco Nahuatl #language-Chiripá #language-Eastern Huasteca Nahuatl #language-Nhuwala #language-Tetelcingo Nahuatl #language-Zacatlán-Ahuacatlán-Tepetzintla Nahuatl #language-Morelos Nahuatl #language-Central Nahuatl #language-Takuu #language-Isthmus-Pajapan Nahuatl #language-Huaxcaleca Nahuatl #language-Naro #language-Ometepec Nahuatl #language-Noone #language-Temascaltepec Nahuatl #language-Western Huasteca Nahuatl #language-Isthmus-Mecayapan Nahuatl #language-Northern Oaxaca Nahuatl #language-Nias #language-Nakame #language-Ngandi #language-Niellim #language-Nek #language-Ngalakgan #language-Nyiha (Tanzania) #language-Nii #language-Ngaju #language-Nila #language-Nilamba #language-Ninzo #language-Nganasan #language-Nandi #language-Nimboran #language-Nimi #language-Southeastern Kolami #language-Niuean #language-Gilyak #language-Nimo #language-Ngiti #language-Ningil #language-Nzanyi #language-Nocte Naga #language-Lotha Naga #language-Gudanji #language-Njen #language-Njalgulgule #language-Angami Naga #language-Liangmai Naga #language-Ao Naga #language-Njerep #language-Ngadjunmaya #language-Njyem #language-Nyishi #language-Nkoya #language-Khoibu Naga #language-Nkongho #language-Koireng #language-Duke #language-Inpui Naga #language-Nekgini #language-Khezha Naga #language-Thangal Naga #language-Nakai #language-Nokuku #language-Nkonya #language-Niuatoputapu #language-Nukuoro #language-North Asmat #language-Bouna Kulango #language-Nkutu #language-Nkoroo #language-Nkari #language-Ngombale #language-Nalca #language-Dutch #language-East Nyala #language-Gela #language-Grangali #language-Nyali #language-Nihali #language-Ngul #language-Nchumbulu #language-Nahali #language-Nyamal #language-Nalögo #language-Maram Naga 
#language-Big Nambas #language-Ngam #language-Ndumu #language-Mzieme Naga #language-Tangkhul Naga (India) #language-Kwasio #language-Monsang Naga #language-Ngombe (Central African Republic) #language-Namakura #language-Ndemli #language-Manangba #language-ǃXóõ #language-Nimanbur #language-Letemboi #language-Namonuito #language-Northeast Maidu #language-Ngamini #language-Nimoa #language-Nama (Papua New Guinea) #language-Namuyi #language-Nawdm #language-Nyangumarta #language-Nande #language-Nancere #language-West Ambae #language-Ngaing #language-Maring Naga #language-Ngiemboon #language-Nyangatom #language-Nankina #language-Northern Rengma Naga #language-Namia #language-Norwegian Nynorsk #language-Wancho Naga #language-Ngindo #language-Narungga #language-Nanticoke #language-Dwang #language-Nugunu (Australia) #language-Southern Nuni #language-Nyangga #language-Nda'nda' #language-Woun Meu #language-Norwegian Bokmål #language-Nuk #language-Northern Thai #language-Nimadi #language-Nogai #language-Nomu #language-Nonuya #language-Nooksack #language-Nomlaki #language-nom #language-Old Norse #language-Numanggang #language-Ngongo #language-Norwegian #language-Nomatsiguenga #language-Ewage-Notu #language-Novial #language-Nyambo #language-Nayi #language-Phom Naga #language-Nepali (individual language) #language-Southeastern Puebla Nahuatl #language-Mondropolon #language-Pochuri Naga #language-Nipsan #language-Napu #language-Southern Nago #language-Kura Ede Nago #language-Ndom #language-N'Ko #language-Ngom #language-Nara #language-Southern Rengma Naga #language-Jèrriais #language-Narango #language-Chokri Naga #language-Ngarla #language-Ngarluma #language-Narom #language-Norn #language-Northern Kalapuya #language-Narua #language-Lala #language-Sangtam Naga #language-Lower Nossob #language-Nsenga #language-Ngasa #language-Naskapi #language-Sumi Naga #language-Nehan #language-Pedi #language-Northern Sierra Miwok #language-Nali #language-Tase Naga #language-Sierra Negra Nahuatl #language-Navut #language-Nsongo #language-Nasal #language-Nisenan #language-Natioro #language-Ngaanyatjarra #language-Ikoma-Nata-Isenye #language-Nateni #language-Ntomba #language-Northern Tepehuan #language-Natügu #language-Nottoway #language-Mantsi #language-Natanzi #language-Yuanga #language-Nukuini #language-Ngala #language-Nusu #language-Nungali #language-Ngumbi #language-Nyole #language-Nuu-chah-nulth #language-Nusa Laut #language-Niuafo'ou #language-Anong #language-Nguôn #language-Nupe-Nupe-Tako #language-Nukuria #language-Nuer #language-Nung (Viet Nam) #language-Nguluwan #language-Mehek #language-Nunggubuyu #language-Tlamacazapa Nahuatl #language-Nasarian #language-Nyokon #language-Nawathinehena #language-Nyabwa #language-Ngwe #language-Southwest Tanna #language-Nawaru #language-Nottoway-Meherrin #language-Nauete #language-Nage #language-Ngad'a #language-Nindi #language-South Nuaulu #language-Ngawun #language-Naxi #language-Ninggerum #language-Nafri #language-Nyanja #language-Nyangbo #language-Nyore #language-Nyengo #language-Giryama #language-Nyikina #language-Ama (Sudan) #language-Nyanga #language-Nyaneka #language-Nyeu #language-Nyamwezi #language-Nyankole #language-Nyoro #language-Nyang'i #language-Nayini #language-Nyungar #language-Nyawaygi #language-Nyungwe #language-Nyulnyul #language-Nganyaywana #language-Nyakyusa-Ngonde #language-Tigon Mbembe #language-Njebi #language-Nzadi #language-Nzima #language-Nzakara #language-Zeme Naga #language-Nzakambay #language-Nanga Dama Dogon #language-Orok #language-Oroch #language-Old 
Aramaic (up to 700 BCE) #language-Obispeño #language-Southern Bontok #language-Moabite #language-Obo Manobo #language-Old Burmese #language-Old Breton #language-Ocaina #language-Old Chinese #language-Occitan (post 1500) #language-Old Cornish #language-Atzingo Matlatzinca #language-Old Dutch #language-Odual #language-Ofo #language-Old Frisian #language-Efutop #language-Ogbia #language-Ogbah #language-Old Georgian #language-Khana #language-Ogbronuagum #language-Old Hittite #language-Oirata #language-Northwestern Ojibwa #language-Central Ojibwa #language-Eastern Ojibwa #language-Ojibwa #language-Old Japanese #language-Severn Ojibwa #language-Ontong Java #language-Western Ojibwa #language-Okanagan #language-Okobo #language-Okodia #language-Okpe (Southwestern Edo) #language-Okiek #language-Oko-Juwoi #language-Oki-No-Erabu #language-Old Korean (3rd-9th cent.) #language-Kirike #language-Oko-Eni-Osayen #language-Oku #language-Orokaiva #language-Walungge #language-Mochi #language-Olekha #language-Oloma #language-Livvi #language-Old Lithuanian #language-Omaha-Ponca #language-East Ambae #language-Mochica #language-Omagua #language-Omok #language-Ombo #language-Utarmbung #language-Old Marathi #language-Omotik #language-Omurano #language-South Tairora #language-Old Mon #language-Ona #language-Lingao #language-Oneida #language-Olo #language-Onin #language-Onjob #language-Onobasulu #language-Onondaga #language-Sartang #language-Ono #language-Unua #language-Old Nubian #language-Tohono O'odham #language-Ong #language-Önge #language-Okpamheri #language-Kopkaka #language-Oksapmin #language-Opao #language-Opata #language-Ofayé #language-Oroha #language-Orma #language-Orejón #language-Oring #language-Oroqen #language-Oriya (macrolanguage) #language-Oromo #language-Orokolo #language-Oruma #language-Adivasi Oriya #language-Ormuri #language-Old Russian #language-Oro Win #language-Oro #language-Odia #language-Ormu #language-Osage #language-Oscan #language-Osing #language-Old Spanish #language-Ossetian #language-Old Saxon #language-Ottoman Turkish (1500-1928) #language-Ot Danum #language-Mezquital Otomi #language-Oti #language-Old Turkish #language-Tilapa Otomi #language-Eastern Highland Otomi #language-Querétaro Otomi #language-Otoro #language-Estado de México Otomi #language-Temoaya Otomi #language-Otuke #language-Ottawa #language-Ixtenco Otomi #language-Tagargrent #language-Glio-Oubi #language-Ouma #language-Elfdalian #language-Owiniga #language-Old Welsh #language-Oy #language-Oyda #language-Wayampi #language-Koonzime #language-Parecís #language-Pacoh #language-Paumarí #language-Pagibete #language-Paranawát #language-Pangasinan #language-Tenharim #language-Pe #language-Parakanã #language-Pahlavi #language-Pampanga #language-Panjabi #language-Northern Paiute #language-Papiamento #language-Parya #language-Panamint #language-Papasena #language-Palauan #language-Pakaásnovos #language-Pawnee #language-Pech #language-Pankararú #language-Páez #language-Mezontla Popoloca #language-Coyotepec Popoloca #language-Paraujano #language-E'ñapa Woromaipu #language-Parkwa #language-Kpasam #language-Papel #language-Badyara #language-Pangwa #language-Central Pame #language-Southern Pashto #language-Northern Pashto #language-Pnar #language-Pyu (Papua New Guinea) #language-Santa Inés Ahuatempan Popoloca #language-Pear #language-Bouyei #language-Picard #language-Ruching Palaung #language-Duruwa #language-Parenga #language-Paite Chin #language-Nigerian Pidgin #language-Piti #language-Pacahuara #language-Anam #language-Pennsylvania 
German #language-Padoe #language-Plautdietsch #language-Kayan #language-Eastern Pomo #language-Mala (Papua New Guinea) #language-Taje #language-Northeastern Pomo #language-Pengo #language-Bonan #language-Chichimeca-Jonaz #language-Northern Pomo #language-Penchal #language-Phende #language-Old Persian (ca. 600-400 B.C.) #language-Kunja #language-Southern Pomo #language-Iranian Persian #language-Petats #language-Eastern Penan #language-Pere #language-Pfaelzisch #language-Sudanese Creole Arabic #language-Gāndhārī #language-Rerep #language-Paelignian #language-Pagu #language-Pa-Hng #language-Phuong #language-Phake #language-Phalura #language-Phoenician #language-Phunoi #language-Phu Thai #language-Pima Bajo #language-Yine #language-Pinji #language-Piaroa #language-Piro #language-Pingelapese #language-Pitcairn-Norfolk #language-Pijao #language-Yom #language-Powhatan #language-Piame #language-Piapoco #language-Pero #language-Piratapuyo #language-Pijin #language-Pitta Pitta #language-Pintupi-Luritja #language-Pileni #language-Pimbwe #language-Piu #language-Pije #language-Pitjantjatjara #language-Ardhamāgadhī Prākrit #language-Pokomo #language-Paekche #language-Pak-Tong #language-Pankhu #language-Pakanha #language-Pökoot #language-Pukapuka #language-Maleng #language-Paku #language-Miani #language-Polonombauk #language-Central Palawano #language-Palu'e #language-Pilagá #language-Paulohi #language-Pali #language-plj #language-Kohistani Shina #language-Shwe Palaung #language-Palenquero #language-Oluta Popoluca #language-Palaic #language-Palaka Senoufo #language-San Marcos Tlacoyalco Popoloca #language-Plateau Malagasy #language-Palikúr #language-Southwest Palawano #language-Brooke's Point Palawano #language-Bolyu #language-Paluan #language-Paama #language-Pambia #language-Pamona #language-Māhārāṣṭri Prākrit #language-Northern Pumi #language-Pamlico #language-Lingua Franca #language-Pomo #language-Pam #language-Pom #language-Northern Pame #language-Paynamar #language-Piemontese #language-Tuamotuan #language-Plains Miwok #language-Papuan Malay #language-Southern Pame #language-Punan Bah-Biau #language-Western Panjabi #language-Western Penan #language-Pangu #language-Penrhyn #language-Aoheng #language-Pinjarup #language-Pinai-Hagahai #language-Panobo #language-Pancana #language-Pana (Burkina Faso) #language-Panim #language-Ponosakan #language-Pontic #language-Jiongnai Bunu #language-Pinigura #language-Banyjima #language-Pinyin #language-Poqomam #language-San Juan Atzingo Popoloca #language-Poke #language-Potiguára #language-Poqomchi' #language-Highland Popoluca #language-Pokangá #language-Polish #language-Southeastern Pomo #language-Pohnpeian #language-Central Pomo #language-Texistepec Popoluca #language-Portuguese #language-Sayula Popoluca #language-Potawatomi #language-Upper Guinea Crioulo #language-San Felipe Otlaltepec Popoloca #language-Polabian #language-Pogolo #language-Papi #language-Paipai #language-Uma #language-Pipil #language-Papuma #language-Papapana #language-Folopa #language-Pei #language-Pare #language-Papora #language-Pa'a #language-Malecite-Passamaquoddy #language-Parachi #language-Parsi-Dari #language-Principense #language-Paranan #language-Prussian #language-Porohanon #language-Paicî #language-Parauk #language-Kibiri #language-Prasuni #language-Old Provençal (to 1500) #language-Ashéninka Perené #language-Puri #language-Dari #language-Puragi #language-Parawen #language-Purik #language-Asue Awyu #language-Central Malay #language-Southwest Pashai #language-Southeast Pashai 
#language-Pauserna #language-Panasuan #language-Pasi #language-Kaulong #language-Central Pashto #language-Sauraseni Prākrit #language-Port Sandwich #language-Pai Tavytera #language-Pataxó Hã-Ha-Hãe #language-Pindiini #language-Patani #language-Zo'é #language-Patep #language-Piamatsina #language-Bambam #language-Port Vato #language-Western Highland Purepecha #language-Purum #language-Punan Merap #language-Puelche #language-Puinave #language-Punan Tubu #language-Puma #language-Puoc #language-Pulabu #language-Puquina #language-Puruborá #language-Punu #language-Puluwatese #language-Purisimeño #language-Pawaia #language-Gapapaiwa #language-Patwin #language-Molbog #language-Paiwan #language-Pwo Western Karen #language-Fyam #language-Poyanáwa #language-Puyuma #language-Pyen #language-Quapaw #language-Huallaga Huánuco Quechua #language-K'iche' #language-Quechua #language-Lambayeque Quechua #language-Chimborazo Highland Quichua #language-South Bolivian Quechua #language-Quileute #language-Chachapoyas Quechua #language-North Bolivian Quechua #language-Sipacapense #language-Quinault #language-Southern Pastaza Quechua #language-Santiago del Estero Quichua #language-Sacapulteco #language-Tena Lowland Quichua #language-Yauyos Quechua #language-Ayacucho Quechua #language-Cusco Quechua #language-Ambo-Pasco Quechua #language-Cajamarca Quechua #language-Eastern Apurímac Quechua #language-Imbabura Highland Quichua #language-Cajatambo North Lima Quechua #language-North Junín Quechua #language-Napo Lowland Quechua #language-San Martín Quechua #language-Huaylla Wanca Quechua #language-Queyu #language-Northern Pastaza Quichua #language-Corongo Ancash Quechua #language-Huaylas Ancash Quechua #language-Kuman (Russia) #language-Kwalhioqua-Tlatskanai #language-Chiquián Ancash Quechua #language-Northern Conchucos Ancash Quechua #language-Southern Conchucos Ancash Quechua #language-Puno Quechua #language-Qashqa'i #language-Cañar Highland Quichua #language-Southern Qiang #language-Arequipa-La Unión Quechua #language-Jauja Wanca Quechua #language-Quenya #language-Quiripi #language-Dungmali #language-Camling #language-Rasawa #language-Rade #language-Western Meohang #language-Logooli #language-Rabha #language-Ramoaaina #language-Rajasthani #language-Tulu-Bohuai #language-Ralte #language-Canela #language-Riantana #language-Rao #language-Rapanui #language-Rarotongan #language-Tegali #language-Raute #language-Sampang #language-Rawang #language-Rapa #language-Rahambuu #language-Rumai Palaung #language-Barababaraba #language-Réunion Creole French #language-Rerau #language-Rembong #language-Rejang Kayan #language-Kara (Tanzania) #language-Rejang #language-Rendille #language-Remo #language-Rengao #language-Reshe #language-Reyesano #language-Roria #language-Rangkas #language-Romagnol #language-Resígaro #language-Southern Roglai #language-Rohingya #language-Yahang #language-Tarifit #language-Riang Lang #language-Nyaturu #language-Nungu #language-Ribun #language-Ritharrngu #language-Riung #language-Raji #language-Rajbanshi #language-Rikbaktsa #language-Rakahanga-Manihiki #language-Rakhine #language-Marka #language-Rama #language-Rembarrnga #language-Carpathian Romani #language-Angloromani #language-Kalo Finnish Romani #language-Traveller Norwegian #language-Murkim #language-Lomavren #language-Romkun #language-Baltic Romani #language-Roma #language-Balkan Romani #language-Sinte Romani #language-Rempi #language-Caló #language-Domari #language-Tavringer Romani #language-Romanova #language-Welsh Romani #language-Vlax Romani 
#language-Ruund #language-Ronga #language-Roon #language-Rongpo #language-Nari Nari #language-Rungwa #language-Tae' #language-Rogo #language-Ronji #language-Rombo #language-Northern Roglai #language-Romansh #language-Romblomanon #language-Romany #language-Romanian #language-Rotokas #language-Kriol #language-Runga #language-Repanbitip #language-Rapting #language-Ririo #language-Waima #language-Romano-Serbian #language-Ratahan #language-Rotuman #language-Rathawi #language-Gungu #language-Rusyn #language-Luguru #language-Roviana #language-Rufiji #language-Che #language-Rundi #language-Istro Romanian #language-Macedo-Romanian #language-Megleno Romanian #language-Russian #language-Rutul #language-Lanas Lobu #language-Rawo #language-Rwa #language-Amba (Uganda) #language-Rawa #language-Marwari (India) #language-Northern Amami-Oshima #language-Yaeyama #language-Central Okinawan #language-Buglere #language-Meskwaki #language-Sandawe #language-Sabanê #language-Safaliba #language-Sango #language-Yakut #language-Sahu #language-Sake #language-Samaritan Aramaic #language-Sanskrit #language-Sause #language-Samburu #language-Saraveca #language-Sasak #language-Santali #language-Saleman #language-Saafi-Saafi #language-Sawi #language-Sa #language-Saya #language-Saurashtra #language-Ngambay #language-Simbo #language-Kele (Papua New Guinea) #language-Southern Samo #language-Saliba #language-Chabu #language-Seget #language-Sori-Harengan #language-Seti #language-Safwa #language-Botolan Sambal #language-Sagala #language-Sabüm #language-Sangu (Tanzania) #language-Sileibi #language-Sembakung Murut #language-Subiya #language-Kimki #language-Stod Bhoti #language-Sabine #language-Simba #language-Chut #language-Dongxiang #language-Sanggau #language-Sri Lankan Creole Malay #language-Sadri #language-Shina #language-Sicilian #language-Scots #language-Hyolmo #language-Sa'och #language-North Slavey #language-Shumcho #language-Sheni #language-Sha #language-Toraja-Sa'dan #language-Sassarese Sardinian #language-Surubu #language-Savi #language-Southern Kurdish #language-Suundi #language-Gallurese Sardinian #language-Bukar-Sadung Bidayuh #language-Sherdukpen #language-Semandang #language-Sened #language-Shuadit #language-Sallands #language-Semai #language-Sechelt #language-Sedang #language-Seneca #language-Cebaara Senoufo #language-Sena #language-Seri #language-Sene #language-Sekani #language-Selkup #language-Nanerigé Sénoufo #language-Suarmin #language-Senara Sénoufo #language-Serrano #language-Koyraboro Senni Songhai #language-Sentani #language-Serui-Laut #language-Nyarafolo Senoufo #language-Sewa Bay #language-Secoya #language-Old Irish (to 900) #language-Kipsigis #language-Surigaonon #language-Segai #language-Shughni #language-Suga #language-Singa #language-Singpho #language-Sangisari #language-Samogitian #language-Brokpake #language-Sebat Bet Gurage #language-Sanglechi #language-Sursurunga #language-Shall-Zwall #language-Ninam #language-Sheko #language-Shua #language-Shoshoni #language-Tachelhit #language-Shatt #language-Shilluk #language-Shendu #language-Shan #language-Shanga #language-Shipibo-Conibo #language-Sala #language-Shi #language-Shuswap #language-Shasta #language-Chadian Arabic #language-Shehri #language-Shwai #language-She #language-Tachawit #language-Syenara Senoufo #language-Akkala Sami #language-Sebop #language-Sidamo #language-Simaa #language-Siamou #language-Paasaal #language-Shom Peng #language-Numbami #language-Sikiana #language-Tumulung Sisaala #language-Mende (Papua New Guinea) #language-Sinhala 
#language-Sikkimese #language-Sonia #language-Siri #language-Siuslaw #language-Sinagen #language-Siwai #language-Sumau #language-Sivandi #language-Siwi #language-Epena #language-Kildin Sami #language-Pite Sami #language-Assangori #language-Kemi Sami #language-Sajalong #language-Mapun #language-Sindarin #language-Xibe #language-Siar-Lak #language-Senhaja De Srair #language-Ter Sami #language-Ume Sami #language-Shawnee #language-Skagit #language-Saek #language-Ma Manda #language-Southern Sierra Miwok #language-Seke (Vanuatu) #language-Sakirabiá #language-Sakalava Malagasy #language-Sikule #language-Sika #language-Seke (Nepal) #language-Kutong #language-Saraiki #language-Maia #language-Sakata #language-Sakao #language-Skou #language-Skepi Creole Dutch #language-Seko Padang #language-Sikaiana #language-Sekar #language-Sáliba #language-Selungai Murut #language-Southern Puget Sound Salish #language-Lower Silesian #language-Salumá #language-Slovak #language-Salt-Yui #language-Pangutaran Sama #language-Salinan #language-Lamaholot #language-Salar #language-Selaru #language-Slovenian #language-Sialum #language-Selayar #language-Ma'ya #language-Southern Sami #language-Simbari #language-Som #language-Northern Sami #language-Simbali #language-Lule Sami #language-Bolinao #language-Central Sama #language-Inari Sami #language-Samoan #language-Samaritan #language-Samo #language-Simeulue #language-Skolt Sami #language-Sumbawa #language-Samba #language-Semnani #language-Shona #language-Sinaugoro #language-Sindhi #language-Bau Bidayuh #language-Noon #language-Sanga (Democratic Republic of Congo) #language-Sensi #language-Soninke #language-Sangil #language-Siona #language-Snohomish #language-Siane #language-Sangu (Gabon) #language-Sihan #language-South West Bay #language-Senggi #language-Sa'ban #language-Selee #language-Sam #language-Saniyo-Hiyewe #language-Kou #language-Sobei #language-So (Democratic Republic of Congo) #language-Songoora #language-Songomeno #language-Sogdian #language-Aka #language-Sokoro #language-Solos #language-Somali #language-Songe #language-Kanasi #language-Somrai #language-Seeku #language-Southern Sotho #language-Southern Thai #language-Sonsorol #language-Sowanda #language-Temi #language-Spanish #language-Sepa (Indonesia) #language-Sapé #language-Saep #language-Sepa (Papua New Guinea) #language-Saponi #language-Selepet #language-Akukem #language-Sanapaná #language-Spokane #language-Supyire Senoufo #language-Saparua #language-Saposa #language-Sapuan #language-South Picene #language-Sabaot #language-Shama-Sambuga #language-Albanian #language-Susquehannock #language-Sorkhei #language-Sou #language-Siculo Arabic #language-Soqotri #language-Squamish #language-Saruga #language-Sora #language-Logudorese Sardinian #language-Sardinian #language-Nafi #language-Sarikoli #language-Siriano #language-Serudung Murut #language-Isirawa #language-Saramaccan #language-Sranan Tongo #language-Campidanese Sardinian #language-Serbian #language-Sirionó #language-Serer #language-Sarsi #language-Suruí #language-Southern Sorsoganon #language-Serua #language-Sera #language-Shahmirzadi #language-Southern Sama #language-Suba-Simbiti #language-Siroi #language-Balangingi #language-Thao #language-Seimat #language-Sausi #language-Sunam #language-Western Sisaala #language-Semnam #language-Sissano #language-Spanish Sign Language #language-So'a #language-Sô #language-Sinasina #language-Susuami #language-Shark Bay #language-Swati #language-Samberigi #language-Saho #language-Sengseng #language-Liana-Seti #language-Seta 
#language-Trieng #language-Shelta #language-Bulo Stieng #language-Matya Samo #language-Arammba #language-Stellingwerfs #language-Owa #language-Stoney #language-Southeastern Tepehuan #language-Saterfriesisch #language-Straits Salish #language-Shumashti #language-Samtao #language-Silt'e #language-Satawalese #language-Sulka #language-Suku #language-Western Subanon #language-Suena #language-Suganga #language-Suki #language-Shubi #language-Sukuma #language-Sundanese #language-Tirmaga-Chai Suri #language-Mwaghavul #language-Susu #language-Subtiaba #language-Puroik #language-Sumbwa #language-Sumerian #language-Suyá #language-Sunwar #language-Svan #language-Ulau-Suain #language-Vincentian Creole English #language-Serili #language-Savosavo #language-Maore Comorian #language-Congo Swahili #language-Swedish #language-Sere #language-Swabian #language-Swahili (individual language) #language-Sui #language-Sira #language-Samosa #language-Sawknah #language-Shanenawa #language-Suau #language-Sharwa #language-Seluwasan #language-Sawila #language-Suwawa #language-Shekhawati #language-Sowa #language-Suruahá #language-Sarua #language-Suba #language-Sighu #language-Shuhi #language-Samre #language-Sangir #language-Saaroa #language-Upper Saxon #language-Siang #language-Central Subanen #language-Classical Syriac #language-Seki #language-Sukur #language-Sylheti #language-Maya Samo #language-Senaya #language-Syriac #language-Sinyar #language-Kagate #language-Semelai #language-Ngalum #language-Semaq Beri #language-szd #language-Seze #language-Sengele #language-Silesian #language-Sula #language-Suabo #language-Isu (Fako Division) #language-Sawai #language-Lower Tanana #language-Tabassaran #language-Lowland Tarahumara #language-Tause #language-Tariana #language-Tapirapé #language-Tagoi #language-Tahitian #language-Eastern Tamang #language-Tal #language-Tamil #language-Tangale #language-Yami #language-Taabwa #language-Tamasheq #language-Central Tarahumara #language-Tay Boi #language-Tatar #language-Upper Tanana #language-Tatuyo #language-Atayal #language-Tocho #language-Aikanã #language-Takia #language-Kaki Ae #language-Tanimbili #language-Mandara #language-North Tairora #language-Dharawal #language-Gaam #language-Tiang #language-Calamian Tagbanwa #language-Tboli #language-Tagbu #language-Barro Negro Tunebo #language-Tawala #language-Taworta #language-Tumtum #language-Tanguat #language-Tembo (Kitembo) #language-Tubar #language-Tobo #language-Tagbanwa #language-Kapin #language-Tabaru #language-Ditammari #language-Ticuna #language-Tanacross #language-Datooga #language-Tafi #language-Southern Tutchone #language-Malinaltepec Me'phaa #language-Tamagario #language-Wára #language-Tchitchege #language-Taman (Myanmar) #language-Tanahmerah #language-Tichurong #language-Kaiy #language-Torres Strait Creole #language-T'en #language-Toda #language-Tulu #language-Thado Chin #language-Tagdal #language-Emberá-Tadó #language-Tai Nüa #language-Tiranige Diga Dogon #language-Talieng #language-Western Tamang #language-Thulung #language-Tomadino #language-Tajio #language-Tambas #language-Sur #language-Taruma #language-Tondano #language-Teme #language-Todrah #language-Doutai #language-Tetun Dili #language-Toro #language-Tandroy-Mahafaly Malagasy #language-Tadyawan #language-Temiar #language-Terik #language-Tepo Krumen #language-Huehuetla Tepehua #language-Teke-Tege #language-Tehuelche #language-Torricelli #language-Ibali Teke #language-Telugu #language-Timne #language-Tama (Colombia) #language-Teso #language-Tepecano #language-Temein 
#language-Tereno #language-Tengger #language-Tetum #language-Soo #language-Teor #language-Tewa (USA) #language-Tennet #language-Tulishi #language-Tanaina #language-Teribe #language-Ternate #language-Tobilung #language-Tigak #language-Ciwogai #language-Tobagonian Creole English #language-Lawunuia #language-Tagin #language-Tajik #language-Tagalog #language-Tandaganon #language-Sudest #language-Tangoa #language-Tring #language-Tareng #language-Nume #language-Central Tagbanwa #language-Tanggu #language-Tagwana Senoufo #language-Tagish #language-Togoyo #language-Thai #language-Kuuk Thaayorre #language-Chitwania Tharu #language-Thangmi #language-Tharaka #language-Dangaura Tharu #language-Aheu #language-Thompson #language-Kochila Tharu #language-Rana Tharu #language-Thakali #language-Tahltan #language-Tahaggart Tamahaq #language-Tayart Tamajeq #language-Tidikelt Tamazight #language-Tira #language-Tifal #language-Tigre #language-Timugon Murut #language-Tiene #language-Tilung #language-Tikar #language-Tillamook #language-Timbe #language-Tindi #language-Teop #language-Trimuris #language-Tiéfo #language-Tigrinya #language-Masadiit Itneg #language-Tinigua #language-Adasen #language-Tiv #language-Tiwi #language-Southern Tiwa #language-Tiruray #language-Tai Hongjin #language-Tunjung #language-Northern Tujia #language-Timucua #language-Southern Tujia #language-Tjurruru #language-Djabwurrung #language-Tukudede #language-Takwane #language-Tesaka Malagasy #language-Tokelau #language-Takelma #language-Toku-No-Shima #language-Tikopia #language-Tee #language-Tsakhur #language-Takestani #language-Upper Necaxa Totonac #language-Teanu #language-Tobelo #language-Yecuatla Totonac #language-Talaud #language-Telefol #language-Tofanma #language-Klingon #language-Tlingit #language-Talinga-Bwisi #language-Taloki #language-Tetela #language-Tolomako #language-Talodi #language-Filomena Mata-Coahuitlán Totonac #language-Tai Loi #language-Talise #language-Tambotalo #language-Sou Nama #language-Tulehu #language-Taliabu #language-Khehek #language-Talysh #language-Tama (Chad) #language-Katbol #language-Tumak #language-Haruai #language-Toba-Maskoy #language-Tutuba #language-Samarokena #language-Tai Thanh #language-Taman (Indonesia) #language-Temoq #language-Tumleo #language-Jewish Babylonian Aramaic (ca. 
200-1200 CE) #language-Tima #language-Tasmate #language-Iau #language-Temuan #language-Tami #language-Tamanaku #language-Tacana #language-Tanimuca-Retuarã #language-Maiani #language-Tandia #language-Kwamera #language-Lenakel #language-Tabla #language-North Tanna #language-Toromono #language-Whitesands #language-Taino #language-Ménik #language-Tontemboan #language-Tonsawang #language-Tanema #language-Ten'edn #language-Toba #language-Coyutla Totonac #language-Gizrra #language-Tonga (Nyasa) #language-Gitonga #language-Tonga (Zambia) #language-Tojolabal #language-Tolowa #language-Tombulu #language-Tonga (Tonga Islands) #language-Xicotepec De Juárez Totonac #language-Papantla Totonac #language-Highland Totonac #language-Jemez #language-Tobian #language-Taupota #language-Azoyú Me'phaa #language-Tippera #language-Tarpia #language-Kula #language-Tok Pisin #language-Tapieté #language-Tlacoapa Me'phaa #language-Tampulma #language-Tupinambá #language-Pisaflores Tepehua #language-Tuparí #language-Tlachichilco Tepehua #language-Tampuan #language-tpw #language-Acatepec Me'phaa #language-Trumai #language-Tinputz #language-Tembé #language-Lehali #language-Toaripi #language-Tomoip #language-Tunni #language-Touo #language-Tonkawa #language-Tirahi #language-Copala Triqui #language-Turi #language-Trinidadian Creole English #language-Lishán Didán #language-Turaka #language-Trió #language-Tregami #language-Trinitario #language-Tarao Naga #language-Kok Borok #language-San Martín Itunyoso Triqui #language-Taushiro #language-Chicahuaxtla Triqui #language-Tunggare #language-Turoyo #language-Sediq #language-Torwali #language-Tringgus-Sembaan Bidayuh #language-Torá #language-Tsaangi #language-Tsamai #language-Tswa #language-Tsakonian #language-Tausug #language-Tsuvan #language-Tsimshian #language-Tshangla #language-Tseku #language-Ts'ün-Lao #language-Tswana #language-Tsonga #language-Akei #language-Tsou #language-Tsogo #language-Tsishingini #language-Mubami #language-Purepecha #language-Tutelo #language-Tektiteko #language-Tauade #language-Bwanabwana #language-Tuotomb #language-Tutong #language-Tobati #language-Tooro #language-Totoro #language-Northern Tutchone #language-Towei #language-Tawallammat Tamajaq #language-Tera #language-Northeastern Thai #language-Muslim Tat #language-Torau #language-Titan #language-Long Wat #language-Sikaritai #language-Wiarumus #language-Tübatulabal #language-Mutu #language-Tuxá #language-Tuyuca #language-Central Tunebo #language-Tunia #language-Taulil #language-Tupuri #language-Turkmen #language-Tula #language-Tumbuka #language-Tunica #language-Tucano #language-Tedaga #language-Turkish #language-Tuscarora #language-Tututni #language-Turkana #language-Tuxináwa #language-Tugen #language-Turka #language-Vaghua #language-Tsuvadi #language-Te'un #language-Southeast Ambrym #language-Tuvalu #language-Tela-Masbuar #language-Tidore #language-Taveta #language-Tunen #language-Sedoa #language-Timor Pidgin #language-Twana #language-Western Tawbuid #language-Twents #language-Tewa (Indonesia) #language-Northern Tiwa #language-Tereweng #language-Tai Dón #language-Twi #language-Tawang Monpa #language-Ere #language-Tasawaq #language-Turiwára #language-Termanu #language-Tuwari #language-Tewe #language-Tawoyan #language-Tombonuo #language-Tokharian B #language-Tsetsaut #language-Totoli #language-Tangut #language-Thracian #language-Ikpeng #language-Tomini #language-West Tarangan #language-Toto #language-Tonsea #language-Citak #language-Kayapó #language-Tatana #language-Tanosy Malagasy #language-Tauya 
#language-Kyanga #language-Teke-Tsaayi #language-Tai Do #language-Kombai #language-Thaypan #language-Tai Daeng #language-Kua #language-Tuvinian #language-Teke-Tyee #language-Tày #language-Tzeltal #language-Tz'utujil #language-Talossan #language-Central Atlas Tamazight #language-Tugun #language-Tzotzil #language-Tabriak #language-Uamué #language-Kuan #language-Tairuma #language-Ubang #language-Ubi #language-Buhi'non Bikol #language-Ubir #language-Umbu-Ungu #language-Ubykh #language-Uda #language-Udihe #language-Udi #language-Ujir #language-Wuzlam #language-Udmurt #language-Uduk #language-Kioko #language-Ufim #language-Ugaritic #language-Ughele #language-Ugong #language-Uhami #language-Damal #language-Uighur #language-Iyive #language-Tanjijili #language-Kaburi #language-Ukuriguma #language-Ukhwejo #language-Kui (India) #language-Ukpe-Bayobiri #language-Ukwa #language-Ukrainian #language-Ukue #language-Kuku #language-Fungwa #language-Ulch #language-Lule #language-Usku #language-Ulithian #language-Meriam Mir #language-Ulumanda' #language-Unserdeutsch #language-Uma' Lung #language-Ulwa #language-Umatilla #language-Umbundu #language-Marrucinian #language-Morrobalama #language-Umon #language-Umotína #language-Umpila #language-Umbugarla #language-Pendau #language-Munsee #language-North Watut #language-Undetermined #language-Uneme #language-Ngarinyin #language-Enawené-Nawé #language-Unami #language-Kurnai #language-Mundari #language-Unde Kaili #language-Uripiv-Wala-Rano-Atchin #language-Urarina #language-Urubú-Kaapor #language-Urningangg #language-Urdu #language-Uru #language-Uradhi #language-Urigina #language-Urhobo #language-Urim #language-Urak Lawoi' #language-Uruangnirin #language-Uru-Pa-In #language-Lehalurup #language-Urat #language-Urumi #language-Uruava #language-Sop #language-Urimo #language-Orya #language-Uru-Eu-Wau-Wau #language-Usarufa #language-Ushojo #language-Usaghade #language-Uspanteco #language-Uya #language-Otank #language-Ute-Southern Paiute #language-Amba (Solomon Islands) #language-Etulo #language-Utu #language-Urum #language-Ura (Vanuatu) #language-U #language-West Uvean #language-Uri #language-Lote #language-Kuku-Uwanh #language-Doko-Uyanga #language-Northern Uzbek #language-Southern Uzbek #language-Vaagri Booli #language-Vale #language-Vafsi #language-Vagla #language-Varhadi-Nagpuri #language-Vai #language-Sekele #language-Vehes #language-Vanimo #language-Valman #language-Vao #language-Huarijio #language-Vasavi #language-Vanuma #language-Wayu #language-Southeast Babar #language-Southwestern Bontok #language-Venetian #language-Veddah #language-Veluws #language-Vemgo-Mabas #language-Venda #language-Ventureño #language-Veps #language-Mom Jango #language-Virgin Islands Creole English #language-Vietnamese #language-Vili #language-Viemo #language-Vilela #language-Vinza #language-Viti #language-Iduna #language-Kariyarra #language-Kujarge #language-Kulisusu #language-Kamakan #language-Kodeoha #language-Korlai Creole Portuguese #language-Kurrama #language-Valpei #language-Vlaams #language-Martuyhunira #language-Barbaram #language-East Masela #language-Mainfränkisch #language-Lungalunga #language-Miwa #language-Makhuwa-Shirima #language-Malgana #language-Mitlatongo Mixtec #language-Soyaltepec Mazatec #language-Marenje #language-Makhuwa #language-Ayautla Mazatec #language-Mazatlán Mazatec #language-Vano #language-Vinmavis #language-Vunapu #language-Volapük #language-Voro #language-Votic #language-Vera'a #language-Võro #language-Varisi #language-Burmbar #language-Vunjo #language-Vute 
#language-Walla Walla #language-Wab #language-Wasco-Wishram #language-Wamesa #language-Walser #language-Watubela #language-Waffa #language-Wolaytta #language-Wampanoag #language-Wan #language-Wappo #language-Wapishana #language-Wagiman #language-Waray (Philippines) #language-Washo #language-Kaninuwa #language-Waurá #language-Waka #language-Waiwai #language-Watam #language-Wayana #language-Wampur #language-Warao #language-Wabo #language-Waritai #language-Wara #language-Wanda #language-Vwanji #language-Alagwa #language-Waigali #language-Wakhi #language-Wa #language-Warlpiri #language-Wagdi #language-Warnman #language-Wajarri #language-Woi #language-Yanomámi #language-Waci Gbe #language-Wandji #language-Wadaginam #language-Wadjiginy #language-Wewaw #language-Wedau #language-Wergaia #language-Weh #language-Kiunum #language-Wemale #language-Westphalien #language-Weri #language-Cameroon Pidgin #language-Perai #language-Wejewa #language-Yafi #language-Wagaya #language-Wagawaga #language-Wangkangurru #language-Wahgi #language-Waigeo #language-Wirangu #language-Warrgamay #language-Sou Upaa #language-Wahau Kenyah #language-Southern Toussian #language-Wichita #language-Wik Ngathan #language-Wik-Me'anha #language-Minidien #language-Wikalkan #language-Wilawila #language-Wik-Mungkan #language-Ho-Chunk #language-Wiraféd #language-Wiru #language-Vitu #language-Wiyot #language-Waja #language-Warji #language-Kw'adza #language-Wakawaka #language-Walio #language-Mwali Comorian #language-Wolane #language-Kunbarlang #language-Welaun #language-Waioli #language-Wailaki #language-Wali (Sudan) #language-Middle Welsh #language-Walloon #language-Wolio #language-Wailapa #language-Wallisian #language-Wuliwuli #language-Wichí Lhamtés Vejoz #language-Walak #language-Waling #language-Wambaya #language-Wamas #language-Mamaindé #language-Wambule #language-Waima'a #language-Wamin #language-Wom (Papua New Guinea) #language-Wambon #language-Walmajarri #language-Mwani #language-Womo #language-Mokati #language-Wantoat #language-Wandarang #language-Waneci #language-Ndzwani Comorian #language-Wanukaka #language-Wano #language-Wanap #language-Usan #language-Wintu #language-Wanyi #language-Kuwema #language-Wè Northern #language-Wogeo #language-Wolani #language-Woleaian #language-Gambian Wolof #language-Wogamusin #language-Kamang #language-Longto #language-Wolof #language-Wom (Nigeria) #language-Wongo #language-Hanga Hundi #language-Wawonii #language-Maco #language-Waluwarra #language-Warungu #language-Wiradjuri #language-Wariyangga #language-Garrwa #language-Warlmanpa #language-Warumungu #language-Warnang #language-Worrorra #language-Waropen #language-Wardaman #language-Waris #language-Waru #language-Waruna #language-Gugu Warra #language-Merwari #language-Waray (Australia) #language-Warembori #language-Adilabad Gondi #language-Wusi #language-Waskia #language-Owenia #language-Wotapuri-Katarqalai #language-Watiwa #language-Wathawurrung #language-Berta #language-Mewati #language-Wotu #language-Wunambal #language-Wutunhua #language-Silimo #language-Wumbvu #language-Bungu #language-Wutung #language-Wu Chinese #language-Wuvulu-Aua #language-Wulna #language-Wauyai #language-Waama #language-Wetamut #language-Warrwa #language-Wawa #language-Waxianghua #language-Wangaaybuwan-Ngiyambaa #language-Woiwurrung #language-Wymysorys #language-Wayoró #language-Western Fijian #language-Andalusian Arabic #language-Sambe #language-Kachari #language-Adai #language-Kaimbé #language-Ararandewára #language-Kalmyk #language-ǀXam #language-Xamtanga 
#language-Apalachee #language-Aquitanian #language-Karami #language-Kamas #language-Katawixi #language-Kauwera #language-Xavánte #language-Kawaiisu #language-Bactrian #language-Bigambal #language-Bunganditj #language-Kombio #language-Middle Breton #language-Kenaboi #language-Kambera #language-Kambiwá #language-Celtiberian #language-Cisalpine Gaulish #language-Chemakum #language-Classical Armenian #language-Comecrudo #language-Cotoname #language-Chorasmian #language-Carian #language-Classical Tibetan #language-Coahuilteco #language-Cayuse #language-Dacian #language-Dharuk #language-Edomite #language-Malayic Dayak #language-Eblan #language-Hdi #language-ǁXegwi #language-Kelo #language-Kembayan #language-Epi-Olmec #language-Xerénte #language-Kesawai #language-Xetá #language-Keoru-Ahia #language-Faliscan #language-Galatian #language-Gudang #language-Gabrielino-Fernandeño #language-Dharumbal #language-Hadrami #language-Xhosa #language-Hattic #language-Hurrian #language-Iberian #language-Xiri #language-Illyrian #language-Xinca #language-Xiriâna #language-Xipaya #language-Jaitmatang #language-Kalkoti #language-Northern Nago #language-Kereho #language-Kagoro #language-Mainstream Kenyah #language-Kayan River Kayan #language-Koroni #language-Xakriabá #language-Kumbewaha #language-Kgalagadi #language-Uma' Lasan #language-Kamula #language-Loup B #language-Lycian #language-Lydian #language-Ligurian (Ancient) #language-Loup A #language-Lepontic #language-Lusitanian #language-Cuneiform Luwian #language-Elymian #language-Mbonga #language-Mingrelian #language-Mengaka #language-Kugu-Muminh #language-Ancient Macedonian #language-Manado Malay #language-Manichaean Middle Persian #language-Meroitic #language-Matbat #language-Kamu #language-Antankarana Malagasy #language-Tsimihety Malagasy #language-Mayaguduna #language-Mori Bawah #language-Kanakanabu #language-Middle Mongolian #language-Ngarigu #language-Ngumbarl #language-Northern Kankanay #language-Anglo-Norman #language-Kanashi #language-Nyiyaparli #language-Kenzi #language-Kokoda #language-Soga #language-Xokleng #language-Komo (Sudan) #language-Konkomba #language-Xukurú #language-Kopar #language-Korubo #language-Kowaki #language-Northeastern Tasmanian #language-Oyster Bay Tasmanian #language-Liberia Kpelle #language-Southeast Tasmanian #language-Phrygian #language-Kulina Pano #language-Pumpokol #language-Mohegan-Pequot #language-Parthian #language-Punthamara #language-Punic #language-Northern Tasmanian #language-Puyo #language-Qatabanian #language-Krahô #language-Kreye #language-Krikati-Timbira #language-Arin #language-Aranama-Tamique #language-Marriammu #language-Karawa #language-Sabaean #language-Sambal #language-Scythian #language-Sempan #language-Shamang #language-Sio #language-South Slavey #language-Kasem #language-Silopi #language-Sherpa #language-xss #language-Sanumá #language-Sudovian #language-Saisiyat #language-Alcozauca Mixtec #language-Katcha-Kadugli-Miri #language-Diuxi-Tilantongo Mixtec #language-Ketengban #language-Transalpine Gaulish #language-Magdalena Peñasco Mixtec #language-Tokharian A #language-Thawa #language-Alu Kurumba #language-Betta Kurumba #language-Kunigami #language-Ngunawal #language-Umbrian #language-Kuo #language-Upper Umpqua #language-Urartian #language-Kuthant #language-Kxoe #language-Kamviri #language-Vandalic #language-Vestinian #language-Kwaza #language-Woccon #language-Wadi Wadi #language-Kwegu #language-Wangkumara #language-Written Oirat #language-Wemba Wemba #language-Ke'o #language-Tambora #language-Mayi-Yapi 
#language-Mayi-Kulan #language-Mayi-Thakurti #language-Yorta Yorta #language-Zhang-Zhung #language-Ancient Zapotec #language-Yaminahua #language-Yuhup #language-Yagua #language-Pumé #language-Yaka (Democratic Republic of Congo) #language-Yámana #language-Yazgulyam #language-Yagnobi #language-Banda-Yangere #language-Yakama #language-Yalunka #language-Yamba #language-Mayangna #language-Yao #language-Yapese #language-Yaqui #language-Yabarana #language-Nugunu (Cameroon) #language-Yambeta #language-Yuwana #language-Yangben #language-Yawalapití #language-Agwagwune #language-Lokaa #language-Yala #language-Yemba #language-West Yugur #language-Yakha #language-Yamphu #language-Hasha #language-Yukuben #language-Yaben #language-Yabaâna #language-Yabong #language-Yawiyo #language-Yaweyuha #language-Lolopo #language-Yucuna #language-Yanda #language-Eastern Yiddish #language-Yidgha #language-Yoidik #language-Ravula #language-Yeniche #language-Yimas #language-Tarok #language-Nyankpa #language-Yetfa #language-Yapunda #language-Yeyi #language-Malyangapa #language-Yagaria #language-Yagwoia #language-Baha Buyang #language-Hlepho Phowa #language-Yinggarda #language-Wusa Nasu #language-Western Yiddish #language-Yidiny #language-Yindjibarndi #language-Yimchungru Naga #language-Yis #language-Axi Yi #language-Yakan #language-Northern Yukaghir #language-Yoke #language-Kap #language-Yasa #language-Yekora #language-Kathu #language-Yakoma #language-Yele #language-Yelogu #language-Angguruk Yali #language-Yil #language-Langnian Buyang #language-Yalarnnga #language-Aribwaung #language-Nyâlayu #language-Yambes #language-Southern Muji #language-Yameo #language-Iamalele #language-Maay #language-Yangum Mon #language-Yamap #language-Yandruwandha #language-Naukan Yupik #language-Yangulam #language-Yana #language-Yendang #language-Yansi #language-Yahuna #language-Yoba #language-Yogad #language-Yonaguni #language-Yokuts #language-Yola #language-Yombe #language-Yongkom #language-Yoruba #language-Yotti #language-Yoron #language-Yoy #language-Phola #language-Phupha #language-Phuza #language-Yerakai #language-Yareba #language-Yaouré #language-Nenets #language-Nhengatu #language-Yerong #language-Yarawata #language-Sani #language-Sirenik Yupik #language-Yessan-Mayo #language-Yucateco #language-Yugambal #language-Yuchi #language-Yue Chinese #language-Havasupai-Walapai-Yavapai #language-Yug #language-Yurutí #language-Karkar-Yuri #language-Yuki #language-Yulu #language-Quechan #language-Bena (Nigeria) #language-Yukpa #language-Yurok #language-Yopno #language-Yau (Morobe Province) #language-Southern Yukaghir #language-East Yugur #language-Yuracare #language-Yawa #language-Yavitero #language-Kalou #language-Yawanawa #language-Wuding-Luquan Yi #language-Yawuru #language-Xishanba Lalo #language-Yawarawarga #language-Yagara #language-Yardliyawarra #language-Yir Yoront #language-Yau (Sandaun Province) #language-E'ma Buyang #language-Sierra de Juárez Zapotec #language-Western Tlacolula Valley Zapotec #language-Ocotlán Zapotec #language-Cajonos Zapotec #language-Yareni Zapotec #language-Ayoquesco Zapotec #language-Zaghawa #language-Isthmus Zapotec #language-Zaramo #language-Zanaki #language-Zauzou #language-Ozolotepec Zapotec #language-Zapotec #language-Aloápam Zapotec #language-Rincón Zapotec #language-Yatzachi Zapotec #language-Mitla Zapotec #language-Zayse-Zergulla #language-Zari #language-Balaibalan #language-Central Berawan #language-East Berawan #language-Blissymbols #language-West Berawan #language-Coatecas Altas Zapotec #language-Central 
Hongshuihe Zhuang #language-Ngazidja Comorian #language-Zeeuws #language-Zenag #language-Zenaga #language-Kinga #language-Guibei Zhuang #language-Standard Moroccan Tamazight #language-Guibian Zhuang #language-Magori #language-Zhuang #language-Zhaba #language-Dai Zhuang #language-Zhire #language-Nong Zhuang #language-Chinese #language-Zhoa #language-Zia #language-Zimakani #language-Mesme #language-Zinza #language-Zigula #language-Zizilivakan #language-Kaimbulawa #language-zkb #language-Kadu #language-Koguryo #language-Karankawa #language-Kott #language-São Paulo Kaingáng #language-Zakhring #language-Kitan #language-Kaurna #language-Liujiang Zhuang #language-Malay (individual language) #language-Lianshan Zhuang #language-Manda (Australia) #language-Zimba #language-Margany #language-Maridan #language-Mangerr #language-Marti Ke #language-Maridjabin #language-Mandandanyi #language-Matngala #language-Marimanindji #language-Mbangwe #language-Molo #language-Mpuono #language-Maranunggu #language-Maringarr #language-Muruwari #language-Mbariman-Gudhinma #language-Mbo (Democratic Republic of Congo) #language-Bomitaba #language-Mariyedi #language-Mbandja #language-Zande (individual language) #language-Mang #language-Mangas #language-Copainalá Zoque #language-Chimalapa Zoque #language-Zou #language-Tabasco Zoque #language-Rayón Zoque #language-Francisco León Zoque #language-Choapan Zapotec #language-San Pedro Quiatoni Zapotec #language-Santa María Quiegolani Zapotec #language-Lachixío Zapotec #language-Mixtepec Zapotec #language-Santa Inés Yatzechi Zapotec #language-Amatlán Zapotec #language-Zoogocho Zapotec #language-Santiago Xanica Zapotec #language-Coatlán Zapotec #language-Yalálag Zapotec #language-Chichicapan Zapotec #language-Zaniza Zapotec #language-San Baltazar Loxicha Zapotec #language-Texmelucan Zapotec #language-Qiubei Zhuang #language-Záparo #language-Zarphatic #language-Mairasi #language-Sarasira #language-Standard Malay #language-Sukurum #language-Xanaguía Zapotec #language-Loxicha Zapotec #language-Quioquitani-Quierí Zapotec #language-Tilquiapan Zapotec #language-Güilá Zapotec #language-Zeem #language-Tokano #language-Zulu #language-Kumzari #language-Zuni #language-Zay #language-Yongbei Zhuang #language-Yang Zhuang #language-Youjiang Zhuang #language-Yongnan Zhuang #language-Zuojiang Zhuang #license-cc0-1.0 #lexicons #region-us \n",
"# PanLex\n\nJanuary 1, 2024 version of PanLex Language Vocabulary with 24,650,274 rows covering 6,152 languages.",
"# Columns\n- 'vocab': contains the text entry. \n- '639-3': contains the ISO 639-3 languages tags to allow users to filter on the language(s) of their choice.\n- '639-3_english_name': the English language name associated to the code ISO 639-3. \n- 'var_code': contains a code to differentiate language variants. In practice, this is the code 639-3 + a number. If 000, it corresponds to the classic 639-3, if the number is different, it corresponds to a variant (e.g. 'fra-006' corresponds to Québécois according to URL This is a code created by Panlex, and therefore does not follow an international standard. \n- 'english_name_var': the English language name associated to the 'var_code'.",
"# Source\nReaders are invited to consult the URL page, which summarizes all the sources.\n\n> Kamholz, David, Jonathan Pool, and Susan M. Colowick. 2014. \n PanLex: Building a Resource for Panlingual Lexical Translation. \n Proceedings of the 9th International Conference on Language Resources and Evaluation (LREC 2014). \n URL"
] |
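A minimal sketch of filtering the PanLex vocabulary by the columns described above, assuming the dataset is loaded from the Hub with a `train` split; the repository id below is a placeholder, not the actual PanLex repository:

```python
from datasets import load_dataset

# Placeholder repository id -- substitute the actual PanLex vocabulary dataset.
REPO_ID = "your-org/panlex-vocabulary"

# Load the vocabulary and keep only the French (ISO 639-3 'fra') entries.
ds = load_dataset(REPO_ID, split="train")
french = ds.filter(lambda row: row["639-3"] == "fra")

# Variants are distinguished by 'var_code', e.g. 'fra-006' for Québécois.
quebecois = french.filter(lambda row: row["var_code"] == "fra-006")

print(french.num_rows, "French rows,", quebecois.num_rows, "Québécois rows")
print(french[0]["vocab"], "-", french[0]["639-3_english_name"])
```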
7b250df388478627648ca5cf7481207d45642dae |
# Dataset of aster/アステール (Granblue Fantasy)
This is the dataset of aster/アステール (Granblue Fantasy), containing 40 images and their tags.
The core tags of this character are `brown_hair, long_hair, animal_ears, brown_eyes, bangs, hair_ornament, breasts, small_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 40.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aster_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 24.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aster_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 93 | 51.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aster_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 36.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aster_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 93 | 70.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aster_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aster_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
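If you only need resized images and plain-text tags rather than the waifuc pipeline, the IMG+TXT packages can be used directly. A minimal sketch, assuming each image in `dataset-800.zip` is paired with a same-named `.txt` file holding its tags (this pairing convention is an assumption, not documented above):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive instead of the raw package
zip_file = hf_hub_download(
    repo_id='CyberHarem/aster_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it next to the raw dataset
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file (assumed convention)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```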
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, erune, solo, blush, looking_at_viewer, bare_arms, bare_shoulders, full_body, simple_background, white_background, brown_footwear, crossbow, open_mouth, scarf, smile, boots, white_dress, black_hair, closed_mouth, fur_trim, holding_weapon, sleeveless_dress |
| 1 | 5 |  |  |  |  |  | 1girl, erune, looking_at_viewer, simple_background, solo, white_background, blush, looking_back, hairclip, open_mouth, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | solo | blush | looking_at_viewer | bare_arms | bare_shoulders | full_body | simple_background | white_background | brown_footwear | crossbow | open_mouth | scarf | smile | boots | white_dress | black_hair | closed_mouth | fur_trim | holding_weapon | sleeveless_dress | looking_back | hairclip |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------|:--------------------|:------------|:-----------------|:------------|:--------------------|:-------------------|:-----------------|:-----------|:-------------|:--------|:--------|:--------|:--------------|:-------------|:---------------|:-----------|:-----------------|:-------------------|:---------------|:-----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | | | X | X | | | X | | X | | | | | | | | X | X |
| CyberHarem/aster_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:25:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:32:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of aster/アステール (Granblue Fantasy)
=========================================
This is the dataset of aster/アステール (Granblue Fantasy), containing 40 images and their tags.
The core tags of this character are 'brown\_hair, long\_hair, animal\_ears, brown\_eyes, bangs, hair\_ornament, breasts, small\_breasts, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
3e7f47494b6e9a816bbf3f51d89993de4847b5c2 |
# Dataset of illnott/イルノート (Granblue Fantasy)
This is the dataset of illnott/イルノート (Granblue Fantasy), containing 51 images and their tags.
The core tags of this character are `breasts, blue_eyes, glasses, long_hair, black_hair, hairband, large_breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 51 | 83.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illnott_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 51 | 45.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illnott_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 123 | 98.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illnott_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 51 | 72.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illnott_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 123 | 147.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illnott_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/illnott_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, necklace, ear_piercing, grin, jacket, off_shoulder, shorts |
| 1 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, blue_bikini, ponytail, navel, collarbone, grin, bare_shoulders, fishnets, o-ring, open_clothes, tattoo, thigh_strap, white_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | necklace | ear_piercing | grin | jacket | off_shoulder | shorts | blue_bikini | ponytail | navel | collarbone | bare_shoulders | fishnets | o-ring | open_clothes | tattoo | thigh_strap | white_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:-----------|:---------------|:-------|:---------|:---------------|:---------|:--------------|:-----------|:--------|:-------------|:-----------------|:-----------|:---------|:---------------|:---------|:--------------|:---------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/illnott_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:25:06+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:35:49+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of illnott/イルノート (Granblue Fantasy)
===========================================
This is the dataset of illnott/イルノート (Granblue Fantasy), containing 51 images and their tags.
The core tags of this character are 'breasts, blue\_eyes, glasses, long\_hair, black\_hair, hairband, large\_breasts, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
830d59cebed23d8a3620c97533f6256b3b5f7c36 |
# Dataset of stella/スーテラ (Granblue Fantasy)
This is the dataset of stella/スーテラ (Granblue Fantasy), containing 55 images and their tags.
The core tags of this character are `animal_ears, short_hair, breasts, hair_ornament, mole, mole_under_mouth, grey_hair, bangs, medium_breasts, brown_hair, grey_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 55 | 49.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 55 | 35.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 114 | 65.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 55 | 45.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 114 | 81.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/stella_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
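Beyond the precomputed clusters below, tag statistics can also be recomputed from the raw package. A minimal sketch, assuming the archive has been extracted to `dataset_dir` as in the snippet above and that `item.meta['tags']` holds either a list of tag strings or a tag-to-score mapping (the exact structure is an assumption):

```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag occurs across the extracted raw dataset
tag_counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta['tags']
    # count tag names whether tags is a list of strings or a tag -> score mapping
    tag_counter.update(tags.keys() if isinstance(tags, dict) else tags)

# the most frequent tags hint at recurring outfits and poses
for tag, count in tag_counter.most_common(20):
    print(f'{count:3d}  {tag}')
```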
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, detached_sleeves, erune, looking_at_viewer, solo, arrow_(projectile), bow_(weapon), cape, holding_weapon, black_thighhighs, full_body, fur_trim, thigh_boots, ass, black_footwear, high_heel_boots, scarf, short_dress, zettai_ryouiki, backless_outfit, bare_shoulders, black_dress, from_behind, gloves, looking_back, quiver, smile, standing, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, erune, looking_at_viewer, solo, collarbone, long_sleeves, choker, shorts, smile, bag, belt, boots, brown_eyes, cleavage, closed_mouth, detached_sleeves, frills, full_body, halterneck, hat, off_shoulder, simple_background, socks, standing, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | erune | looking_at_viewer | solo | arrow_(projectile) | bow_(weapon) | cape | holding_weapon | black_thighhighs | full_body | fur_trim | thigh_boots | ass | black_footwear | high_heel_boots | scarf | short_dress | zettai_ryouiki | backless_outfit | bare_shoulders | black_dress | from_behind | gloves | looking_back | quiver | smile | standing | white_background | collarbone | long_sleeves | choker | shorts | bag | belt | boots | brown_eyes | cleavage | closed_mouth | frills | halterneck | hat | off_shoulder | simple_background | socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------|:--------------------|:-------|:---------------------|:---------------|:-------|:-----------------|:-------------------|:------------|:-----------|:--------------|:------|:-----------------|:------------------|:--------|:--------------|:-----------------|:------------------|:-----------------|:--------------|:--------------|:---------|:---------------|:---------|:--------|:-----------|:-------------------|:-------------|:---------------|:---------|:---------|:------|:-------|:--------|:-------------|:-----------|:---------------|:---------|:-------------|:------|:---------------|:--------------------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/stella_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:25:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:37:29+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of stella/スーテラ (Granblue Fantasy)
=========================================
This is the dataset of stella/スーテラ (Granblue Fantasy), containing 55 images and their tags.
The core tags of this character are 'animal\_ears, short\_hair, breasts, hair\_ornament, mole, mole\_under\_mouth, grey\_hair, bangs, medium\_breasts, brown\_hair, grey\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
c95fa2e120636c7b33d8ccffa74854a5dd811159 |
# Dataset of anna/アンナ (Granblue Fantasy)
This is the dataset of anna/アンナ (Granblue Fantasy), containing 24 images and their tags.
The core tags of this character are `hair_over_one_eye, long_hair, red_hair, very_long_hair, green_eyes, hat, witch_hat, breasts, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 30.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 17.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 43 | 31.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 25.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 43 | 47.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/anna_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, one-piece_swimsuit, solo, arm_garter, bridal_garter, frilled_swimsuit, looking_at_viewer, absurdly_long_hair, barefoot, stuffed_cat, bare_shoulders, cat_hair_ornament, flower, parasol, water |
| 1 | 11 |  |  |  |  |  | 1girl, solo, looking_at_viewer, frills, stuffed_cat, holding, simple_background, white_background, witch, black_headwear, candle, black_dress, black_footwear, closed_mouth, full_body, knee_boots, sleeveless_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | one-piece_swimsuit | solo | arm_garter | bridal_garter | frilled_swimsuit | looking_at_viewer | absurdly_long_hair | barefoot | stuffed_cat | bare_shoulders | cat_hair_ornament | flower | parasol | water | frills | holding | simple_background | white_background | witch | black_headwear | candle | black_dress | black_footwear | closed_mouth | full_body | knee_boots | sleeveless_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:-------|:-------------|:----------------|:-------------------|:--------------------|:---------------------|:-----------|:--------------|:-----------------|:--------------------|:---------|:----------|:--------|:---------|:----------|:--------------------|:-------------------|:--------|:-----------------|:---------|:--------------|:-----------------|:---------------|:------------|:-------------|:-------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | X | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/anna_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:25:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:32:45+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of anna/アンナ (Granblue Fantasy)
======================================
This is the dataset of anna/アンナ (Granblue Fantasy), containing 24 images and their tags.
The core tags of this character are 'hair\_over\_one\_eye, long\_hair, red\_hair, very\_long\_hair, green\_eyes, hat, witch\_hat, breasts, hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1253641761c39c91be061ce116a0ca3fb38cb9cb |
This dataset contains manual sentiment annotations within a scientific context, with labels "p," "n," and "o" denoting positive, negative, and others, respectively. The dataset comprises 100 records and was created to evaluate the performance of the sci-sentiment-classify model: https://huggingface.co/puzzz21/sci-sentiment-classify.
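A minimal sketch of checking the classifier against these annotations, assuming the dataset has a `train` split with columns named `text` and `label` (both column names are assumptions; the model's predicted label names may also need to be mapped onto "p"/"n"/"o"):

```python
from datasets import load_dataset
from transformers import pipeline

# the 100 manually annotated records described above
ds = load_dataset("puzzz21/sci-sentiment-annotated-dataset", split="train")

# the classifier under evaluation
clf = pipeline("text-classification", model="puzzz21/sci-sentiment-classify")

# column names 'text' and 'label' are assumptions -- adjust to the actual schema
matches = 0
for row in ds:
    pred = clf(row["text"], truncation=True)[0]["label"]
    # note: pred may use the model's own label names and need mapping to "p"/"n"/"o"
    matches += int(pred == row["label"])

print(f"agreement: {matches}/{len(ds)}")
```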
| puzzz21/sci-sentiment-annotated-dataset | [
"task_categories:text-classification",
"size_categories:n<1K",
"language:en",
"doi:10.57967/hf/1667",
"region:us"
] | 2024-01-21T17:25:46+00:00 | {"language": ["en"], "size_categories": ["n<1K"], "task_categories": ["text-classification"]} | 2024-01-21T17:33:01+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #size_categories-n<1K #language-English #doi-10.57967/hf/1667 #region-us
|
This dataset contains manual sentiment annotations within a scientific context, with labels "p", "n", and "o" denoting positive, negative, and other, respectively. The dataset comprises 100 records and was created to evaluate the performance of the sci-sentiment-classify model: URL
| [] | [
"TAGS\n#task_categories-text-classification #size_categories-n<1K #language-English #doi-10.57967/hf/1667 #region-us \n"
] |
89494c8fdcc3fa4f97abca0d498e2eb1841d1840 |
# Dataset of de_la_fille/レ・フィーエ (Granblue Fantasy)
This is the dataset of de_la_fille/レ・フィーエ (Granblue Fantasy), containing 33 images and their tags.
The core tags of this character are `long_hair, hair_ornament, twintails, breasts, green_hair, green_eyes, large_breasts, multicolored_hair, drill_hair, gradient_hair, bangs, ahoge, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 33 | 39.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_la_fille_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 33 | 27.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_la_fille_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 49.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_la_fille_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 33 | 37.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_la_fille_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 62.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_la_fille_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/de_la_fille_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, elbow_gloves, looking_at_viewer, smile, solo, strapless_dress, white_gloves, detached_collar, blush, simple_background, twin_drills, white_background, white_dress, antenna_hair, upper_body |
| 1 | 8 |  |  |  |  |  | 1girl, solo, gem, cleavage, looking_at_viewer, open_mouth, ring, crystal, green_dress, cape, full_body, navel_cutout, simple_background |
| 2 | 6 |  |  |  |  |  | 1girl, bikini, cleavage, looking_at_viewer, navel, solo, armlet, barefoot, necklace, anklet, blush, bracelet, open_mouth, white_background, bare_shoulders, cup, drink, fruit, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | cleavage | elbow_gloves | looking_at_viewer | smile | solo | strapless_dress | white_gloves | detached_collar | blush | simple_background | twin_drills | white_background | white_dress | antenna_hair | upper_body | gem | open_mouth | ring | crystal | green_dress | cape | full_body | navel_cutout | bikini | navel | armlet | barefoot | necklace | anklet | bracelet | cup | drink | fruit |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:---------------|:--------------------|:--------|:-------|:------------------|:---------------|:------------------|:--------|:--------------------|:--------------|:-------------------|:--------------|:---------------|:-------------|:------|:-------------|:-------|:----------|:--------------|:-------|:------------|:---------------|:---------|:--------|:---------|:-----------|:-----------|:---------|:-----------|:------|:--------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | X | | X | | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | X | | X | | | | X | X | | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/de_la_fille_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:35:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:41:37+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of de\_la\_fille/レ・フィーエ (Granblue Fantasy)
==================================================
This is the dataset of de\_la\_fille/レ・フィーエ (Granblue Fantasy), containing 33 images and their tags.
The core tags of this character are 'long\_hair, hair\_ornament, twintails, breasts, green\_hair, green\_eyes, large\_breasts, multicolored\_hair, drill\_hair, gradient\_hair, bangs, ahoge, earrings', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...), and the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d4cf2cf28e72663794ee055fa49f9216ac78180e |
# Dataset of pengi/ペンギー (Granblue Fantasy)
This is the dataset of pengi/ペンギー (Granblue Fantasy), containing 85 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, twintails, goggles_on_head, short_hair, short_twintails, breasts, ahoge, fang, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 85 | 84.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pengi_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 85 | 55.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pengi_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 200 | 116.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pengi_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 85 | 77.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pengi_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 200 | 150.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pengi_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pengi_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 85 |  |  |  |  |  | 1girl, goggles, solo, smile, looking_at_viewer, bare_shoulders, open_mouth, blush, fingerless_gloves, belt, black_gloves, armlet, covered_navel, sweat, short_shorts, one_eye_closed |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | goggles | solo | smile | looking_at_viewer | bare_shoulders | open_mouth | blush | fingerless_gloves | belt | black_gloves | armlet | covered_navel | sweat | short_shorts | one_eye_closed |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------|:--------|:--------------------|:-----------------|:-------------|:--------|:--------------------|:-------|:---------------|:---------|:----------------|:--------|:---------------|:-----------------|
| 0 | 85 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/pengi_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:35:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:49:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of pengi/ペンギー (Granblue Fantasy)
========================================
This is the dataset of pengi/ペンギー (Granblue Fantasy), containing 85 images and their tags.
The core tags of this character are 'blonde\_hair, blue\_eyes, twintails, goggles\_on\_head, short\_hair, short\_twintails, breasts, ahoge, fang, small\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...), and the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
3093335017486999c44be52680b1f4317e82a648 |
# Dataset of rosamia/ロザミア (Granblue Fantasy)
This is the dataset of rosamia/ロザミア (Granblue Fantasy), containing 37 images and their tags.
The core tags of this character are `long_hair, black_hair, breasts, hair_between_eyes, purple_hair, brown_eyes, multicolored_hair, yellow_eyes, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 37 | 48.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosamia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 37 | 28.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosamia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 84 | 59.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosamia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 37 | 43.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosamia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 84 | 82.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosamia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rosamia_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, looking_at_viewer, solo, mask, holding, sword, gloves, simple_background, belt, boots, gauntlets |
| 1 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, mask, penis, looking_at_viewer, nipples, blush, large_breasts, mosaic_censoring, pov, bangs, panties, sweat, bar_censor, breasts_out, clothes_lift, ejaculation, one_eye_covered, open_mouth, simple_background, spread_legs, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | mask | holding | sword | gloves | simple_background | belt | boots | gauntlets | 1boy | hetero | solo_focus | penis | nipples | blush | large_breasts | mosaic_censoring | pov | bangs | panties | sweat | bar_censor | breasts_out | clothes_lift | ejaculation | one_eye_covered | open_mouth | spread_legs | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------|:----------|:--------|:---------|:--------------------|:-------|:--------|:------------|:-------|:---------|:-------------|:--------|:----------|:--------|:----------------|:-------------------|:------|:--------|:----------|:--------|:-------------|:--------------|:---------------|:--------------|:------------------|:-------------|:--------------|:-----------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/rosamia_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:35:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:43:48+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of rosamia/ロザミア (Granblue Fantasy)
==========================================
This is the dataset of rosamia/ロザミア (Granblue Fantasy), containing 37 images and their tags.
The core tags of this character are 'long\_hair, black\_hair, breasts, hair\_between\_eyes, purple\_hair, brown\_eyes, multicolored\_hair, yellow\_eyes, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...), and the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b6c50e2a3a94c509cf6c9e8ba3487e02a0e9ba23 |
# Dataset of diola/ジオラ (Granblue Fantasy)
This is the dataset of diola/ジオラ (Granblue Fantasy), containing 79 images and their tags.
The core tags of this character are `long_hair, blonde_hair, hat, breasts, large_breasts, green_eyes, brown_eyes, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 79 | 81.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diola_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 79 | 60.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diola_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 181 | 115.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diola_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 79 | 78.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diola_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 181 | 141.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diola_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/diola_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 43 |  |  |  |  |  | 1girl, solo, looking_at_viewer, bare_shoulders, blush, open_mouth, smile, simple_background, cleavage_cutout, white_background, dress |
| 1 | 12 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, solo_focus, censored, nipples, open_mouth, bare_shoulders, cum, paizuri, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | bare_shoulders | blush | open_mouth | smile | simple_background | cleavage_cutout | white_background | dress | 1boy | hetero | penis | solo_focus | censored | nipples | cum | paizuri |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------------|:--------|:-------------|:--------|:--------------------|:------------------|:-------------------|:--------|:-------|:---------|:--------|:-------------|:-----------|:----------|:------|:----------|
| 0 | 43 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | | X | X | X | X | | | X | | X | X | X | X | X | X | X | X |
| CyberHarem/diola_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:35:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:49:22+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of diola/ジオラ (Granblue Fantasy)
=======================================
This is the dataset of diola/ジオラ (Granblue Fantasy), containing 79 images and their tags.
The core tags of this character are 'long\_hair, blonde\_hair, hat, breasts, large\_breasts, green\_eyes, brown\_eyes, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...), and the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
0143b644a75ed3b6c1afd5e7cf326a1d51138c3e |
[Blog](https://sfcompute.com/blog/tiny-narrations) | [GitHub](https://github.com/sfcompute/tinynarrations)

```bash
pip install datasets
```
```python
from datasets import load_dataset
val_split = load_dataset('sfcompute/TinyNarrations', split='validation', streaming=True)
train_split = load_dataset('sfcompute/TinyNarrations', split='train', streaming=True)
```
```python
import torch
wav = torch.from_numpy(next(iter(val_split))['audio']['array']).unsqueeze(0)
```
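If the sampling rate is also needed, the `datasets` `Audio` feature typically exposes it next to the decoded array; a minimal sketch (the `sampling_rate` key is assumed from that standard feature rather than stated in this card):
```python
import torch

sample = next(iter(val_split))
sr = sample['audio']['sampling_rate']  # assumed field of the standard Audio feature
wav = torch.from_numpy(sample['audio']['array']).unsqueeze(0)  # shape: (1, num_samples)
print(wav.shape, sr)
```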
To load audio, ensure you have the following installed:
```bash
pip install librosa soundfile
``` | sfcompute/TinyNarrations | [
"license:other",
"region:us"
] | 2024-01-21T17:37:43+00:00 | {"license": ["other"], "viewer": false, "dataset_info": {"features": [{"name": "path", "dtype": "string"}, {"name": "audio", "dtype": "Audio"}], "config_name": "default", "splits": [{"name": "train", "num_bytes": 783536881667, "num_examples": 89112}, {"name": "validation", "num_bytes": 16526026753, "num_examples": 864}], "download_size": 800062908420}} | 2024-01-24T14:47:25+00:00 | [] | [] | TAGS
#license-other #region-us
|
Blog | GitHub
!Narrator
To load audio, ensure you have the following installed:
| [] | [
"TAGS\n#license-other #region-us \n"
] |
48d8b05565669ae1654adfcb8c1aee9f5384ab5f | Jon Durbin's bagel dataset, shuffled. | Locutusque/bagel-clean-v0.3-shuffled | [
"task_categories:conversational",
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-21T17:39:31+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["conversational", "question-answering", "text-generation"]} | 2024-01-21T20:04:28+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #task_categories-question-answering #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-apache-2.0 #region-us
| Jon Durbin's bagel dataset, shuffled. | [] | [
"TAGS\n#task_categories-conversational #task_categories-question-answering #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-apache-2.0 #region-us \n"
] |
c11e3aa29e9ce6bde02f760bda423e90b18391cc |
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.11
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.11](https://huggingface.co/andysalerno/openchat-nectar-0.11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11",
"harness_winogrande_5",
split="train")
```
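A similarly hedged sketch for pulling only the aggregated results (the config name "results" and the "train" split are taken from the description above and should be treated as assumptions if the repo layout differs):
```python
from datasets import load_dataset

# Assumption: the aggregated metrics live in a config literally named "results",
# whose "train" split points at the latest run, as described above.
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11",
    "results",
    split="train",
)
```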
## Latest results
These are the [latest results from run 2024-01-21T17:37:46.856873](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11/blob/main/results_2024-01-21T17-37-46.856873.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6550217713416889,
"acc_stderr": 0.031883973576992575,
"acc_norm": 0.6556984270442959,
"acc_norm_stderr": 0.032539393609809474,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5291674189872531,
"mc2_stderr": 0.015420698178455278
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.014124597881844461,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.01382204792228351
},
"harness|hellaswag|10": {
"acc": 0.6379207329217288,
"acc_stderr": 0.004796193584930072,
"acc_norm": 0.8328022306313483,
"acc_norm_stderr": 0.0037238973056454936
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150878,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150878
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741624,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966348,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966348
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.02357688174400572,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.02357688174400572
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4908735332464146,
"acc_stderr": 0.01276810860164001,
"acc_norm": 0.4908735332464146,
"acc_norm_stderr": 0.01276810860164001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377204,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377204
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5291674189872531,
"mc2_stderr": 0.015420698178455278
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.012799353675801838
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11 | [
"region:us"
] | 2024-01-21T17:40:07+00:00 | {"pretty_name": "Evaluation run of andysalerno/openchat-nectar-0.11", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.11](https://huggingface.co/andysalerno/openchat-nectar-0.11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T17:37:46.856873](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11/blob/main/results_2024-01-21T17-37-46.856873.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6550217713416889,\n \"acc_stderr\": 0.031883973576992575,\n \"acc_norm\": 0.6556984270442959,\n \"acc_norm_stderr\": 0.032539393609809474,\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5291674189872531,\n \"mc2_stderr\": 0.015420698178455278\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.01382204792228351\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6379207329217288,\n \"acc_stderr\": 0.004796193584930072,\n \"acc_norm\": 0.8328022306313483,\n \"acc_norm_stderr\": 0.0037238973056454936\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n 
\"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.03021683101150878,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.03021683101150878\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n \"acc_stderr\": 0.013223928616741624,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741624\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n \"acc_stderr\": 0.014614465821966348,\n \"acc_norm\": 0.2569832402234637,\n \"acc_norm_stderr\": 0.014614465821966348\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.02357688174400572,\n \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.02357688174400572\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4908735332464146,\n \"acc_stderr\": 0.01276810860164001,\n \"acc_norm\": 0.4908735332464146,\n \"acc_norm_stderr\": 0.01276810860164001\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377204,\n \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377204\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5291674189872531,\n \"mc2_stderr\": 0.015420698178455278\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6846095526914329,\n \"acc_stderr\": 0.012799353675801838\n }\n}\n```", "repo_url": "https://huggingface.co/andysalerno/openchat-nectar-0.11", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|arc:challenge|25_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|gsm8k|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hellaswag|10_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-37-46.856873.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-37-46.856873.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-37-46.856873.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T17-37-46.856873.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-37-46.856873.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["**/details_harness|winogrande|5_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-21T17-37-46.856873.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T17_37_46.856873", "path": ["results_2024-01-21T17-37-46.856873.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T17-37-46.856873.parquet"]}]}]} | 2024-01-21T17:40:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.11
Dataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.11 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-21T17:37:46.856873 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.11\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.11 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T17:37:46.856873(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.11\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.11 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T17:37:46.856873(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2b2d279e8eb144dd147230a18d5dc3d126249a22 |
# Dataset Card for Evaluation run of freeCS-dot-org/ThetaZero-7B-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freeCS-dot-org/ThetaZero-7B-1](https://huggingface.co/freeCS-dot-org/ThetaZero-7B-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1",
"harness_winogrande_5",
split="train")
```
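The snippet above targets a single per-task configuration. As a further, hedged illustration (not part of the original card), the same call works for any of the other configurations produced by this run: config names follow the `harness_<task>_<n-shot>` pattern used by these evaluation datasets, and the `latest` split is an alias for the most recent timestamped run. The column layout of the parquet files is not documented in this card, so the sketch below simply inspects it after loading.

```python
from datasets import load_dataset

# Hedged sketch: load another per-task configuration from the same run.
# "harness_gsm8k_5" is assumed to exist based on the naming pattern of
# these evaluation datasets; adjust if your config list differs.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1",
    "harness_gsm8k_5",
    split="latest",  # alias for the most recent timestamped split
)

print(gsm8k_details.column_names)  # discover the available fields
print(len(gsm8k_details))          # number of evaluated examples
```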
## Latest results
These are the [latest results from run 2024-01-21T17:39:05.540734](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1/blob/main/results_2024-01-21T17-39-05.540734.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6324574731797497,
"acc_stderr": 0.03252185227458061,
"acc_norm": 0.634882895011124,
"acc_norm_stderr": 0.03317372855494942,
"mc1": 0.45777233782129745,
"mc1_stderr": 0.01744096571248212,
"mc2": 0.6248252798346551,
"mc2_stderr": 0.015549488905596676
},
"harness|arc:challenge|25": {
"acc": 0.628839590443686,
"acc_stderr": 0.014117971901142822,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729122
},
"harness|hellaswag|10": {
"acc": 0.6672973511252739,
"acc_stderr": 0.004702181042215892,
"acc_norm": 0.8569010157339175,
"acc_norm_stderr": 0.0034945810763985386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469546,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469546
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941187,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941187
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876166,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876166
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.01271540484127774,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.01271540484127774
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555404,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555404
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45777233782129745,
"mc1_stderr": 0.01744096571248212,
"mc2": 0.6248252798346551,
"mc2_stderr": 0.015549488905596676
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.01126851997157768
},
"harness|gsm8k|5": {
"acc": 0.558756633813495,
"acc_stderr": 0.01367705947859264
}
}
```
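For readers who prefer to work with the raw results file linked above rather than the parquet configurations, the following hedged sketch downloads that JSON from the dataset repository and reads one task's metrics. The filename is taken from the link in this section; the exact nesting of the per-task metrics inside the file is an assumption based on the block shown above, so the lookup is written defensively.

```python
import json

from huggingface_hub import hf_hub_download

# Hedged sketch: fetch the raw results JSON referenced in "Latest results".
# The filename comes from the link above; whether the per-task metrics sit
# at the top level or under a "results" key is an assumption.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1",
    filename="results_2024-01-21T17-39-05.540734.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

task_metrics = raw.get("results", raw)  # fall back to top level if no "results" key
print(task_metrics["harness|gsm8k|5"])  # e.g. {'acc': 0.558..., 'acc_stderr': ...}
```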
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1 | [
"region:us"
] | 2024-01-21T17:41:26+00:00 | {"pretty_name": "Evaluation run of freeCS-dot-org/ThetaZero-7B-1", "dataset_summary": "Dataset automatically created during the evaluation run of model [freeCS-dot-org/ThetaZero-7B-1](https://huggingface.co/freeCS-dot-org/ThetaZero-7B-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T17:39:05.540734](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1/blob/main/results_2024-01-21T17-39-05.540734.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6324574731797497,\n \"acc_stderr\": 0.03252185227458061,\n \"acc_norm\": 0.634882895011124,\n \"acc_norm_stderr\": 0.03317372855494942,\n \"mc1\": 0.45777233782129745,\n \"mc1_stderr\": 0.01744096571248212,\n \"mc2\": 0.6248252798346551,\n \"mc2_stderr\": 0.015549488905596676\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.628839590443686,\n \"acc_stderr\": 0.014117971901142822,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729122\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6672973511252739,\n \"acc_stderr\": 0.004702181042215892,\n \"acc_norm\": 0.8569010157339175,\n \"acc_norm_stderr\": 0.0034945810763985386\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n 
\"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469546,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469546\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941187,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941187\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n 
\"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.01271540484127774,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.01271540484127774\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45777233782129745,\n \"mc1_stderr\": 0.01744096571248212,\n \"mc2\": 0.6248252798346551,\n \"mc2_stderr\": 0.015549488905596676\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.01126851997157768\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.558756633813495,\n \"acc_stderr\": 0.01367705947859264\n }\n}\n```", "repo_url": 
"https://huggingface.co/freeCS-dot-org/ThetaZero-7B-1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|arc:challenge|25_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|gsm8k|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hellaswag|10_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-39-05.540734.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-39-05.540734.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-39-05.540734.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T17-39-05.540734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-39-05.540734.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T17_39_05.540734", "path": ["**/details_harness|winogrande|5_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T17-39-05.540734.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T17_39_05.540734", "path": ["results_2024-01-21T17-39-05.540734.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T17-39-05.540734.parquet"]}]}]} | 2024-01-21T17:41:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of freeCS-dot-org/ThetaZero-7B-1
Dataset automatically created during the evaluation run of model freeCS-dot-org/ThetaZero-7B-1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
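A minimal example, using the repository and config names given in this card's metadata (any listed config, e.g. `harness_winogrande_5`, can be loaded the same way):

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1",
    "harness_winogrande_5",
    split="train",
)
```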
## Latest results
These are the latest results from run 2024-01-21T17:39:05.540734 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of freeCS-dot-org/ThetaZero-7B-1\n\n\n\nDataset automatically created during the evaluation run of model freeCS-dot-org/ThetaZero-7B-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T17:39:05.540734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of freeCS-dot-org/ThetaZero-7B-1\n\n\n\nDataset automatically created during the evaluation run of model freeCS-dot-org/ThetaZero-7B-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T17:39:05.540734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3d6500b47abc7f4633b8a2f19413715c3511b7a5 |
# Cicero's De finibus bonorum et malorum
## Overview
This dataset contains the text from Cicero's "De finibus bonorum et malorum." It is intended for use in comparing different neural network architectures within the "Neural Network Evolution" project.
## Source
Texts are sourced from [The Latin Library](http://www.thelatinlibrary.com/).
## Preprocessing
The dataset was prepared using a Python script for fetching and processing the text. The script ensures consistent formatting and encoding.
Formatting decisions:
- remove all text in the HTML header
- remove all HTML tags and, for anchor tags, their bodies
- remove any references in brackets, e.g. [1]
- remove any implicit text additions, e.g. [et]
- collapse runs of newlines to two, separating paragraphs
- standardize paragraph indentation to one tab
See fetch_text.py for details.
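The sketch below illustrates the kind of cleanup described above. It is an approximation rather than the actual fetch_text.py: the function name, the example URL, and the use of `requests` and `re` are assumptions.

```python
import html
import re

import requests


def fetch_and_clean(url: str) -> str:
    """Fetch a Latin Library page and apply the formatting decisions listed above."""
    resp = requests.get(url, timeout=30)
    resp.encoding = resp.apparent_encoding  # pages are not always declared as UTF-8
    page = resp.text

    # Remove all text in the HTML header.
    page = re.sub(r"(?is)<head\b.*?</head>", "", page)
    # Remove anchor tags together with their bodies, then strip the remaining tags.
    page = re.sub(r"(?is)<a\b[^>]*>.*?</a>", "", page)
    text = html.unescape(re.sub(r"(?s)<[^>]+>", "", page))
    # Remove bracketed references such as [1] and implicit additions such as [et].
    text = re.sub(r"\[[^\]]*\]", "", text)
    # Collapse runs of newlines to two, separating paragraphs.
    text = re.sub(r"\n\s*\n+", "\n\n", text)
    # Standardize paragraph indentation to one tab.
    text = re.sub(r"(?m)^[ \t]+", "\t", text)
    return text.strip()


if __name__ == "__main__":
    # Hypothetical URL for Book 1; the real script may fetch the five books from other paths.
    print(fetch_and_clean("http://www.thelatinlibrary.com/cicero/fin1.shtml")[:500])
```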
## Structure of the Dataset
- Format: Plain text
- Encoding: UTF-8
- Size: Approximately 94,616 words and 638,272 characters across 5 books
## License
This dataset is released under the [Creative Commons Attribution 4.0 International License](https://creativecommons.org/licenses/by/4.0/).
| rl337/cicero-bonorum-et-malorum | [
"task_categories:text-generation",
"language:la",
"region:us"
] | 2024-01-21T17:46:46+00:00 | {"language": ["la"], "task_categories": ["text-generation"], "pretty_name": "Cicero's De finibus bonorum et malorum"} | 2024-01-25T16:54:04+00:00 | [] | [
"la"
] | TAGS
#task_categories-text-generation #language-Latin #region-us
|
# Cicero's De finibus bonorum et malorum
## Overview
This dataset contains the text from Cicero's "De finibus bonorum et malorum." It is intended for use in comparing different neural network architectures within the "Neural Network Evolution" project.
## Source
Texts are sourced from The Latin Library.
## Preprocessing
The dataset was prepared using a Python script for fetching and processing the text. The script ensures consistent formatting and encoding.
Formatting decisions:
- remove all text in the HTML header
- remove all HTML tags and, for anchor tags, their bodies
- remove any references in brackets, e.g. [1]
- remove any implicit text additions, e.g. [et]
- collapse runs of newlines to two, separating paragraphs
- standardize paragraph indentation to one tab
See fetch_text.py for details.
## Structure of the Dataset
- Format: Plain text
- Encoding: UTF-8
- Size: Approximately 94,616 words and 638,272 characters across 5 books
## License
This dataset is released under the Creative Commons Attribution 4.0 International License.
| [
"# Cicero's De finibus bonorum et malorum",
"## Overview\nThis dataset contains the text from Cicero's \"De finibus bonorum et malorum.\" It is intended for use in comparing different neural network architectures within the \"Neural Network Evolution\" project.",
"## Source\nTexts are sourced from The Latin Library. The Latin Library",
"## Preprocessing\nThe dataset was prepared using a Python script for fetching and processing the text. The script ensures consistent formatting and encoding.\n\nFormatting decision\n- remove all text in html header\n- remove all html tags and tag bodies for anchors\n- remove any references in brackets. ex: [1]\n- remove any implicit text additions. ex: [et]\n- collapse newlines to two separating paragraphs\n- standardize paragraph intent to one tab\n\nSee fetch_text.py for details.",
"## Structure of the Dataset\n- Format: Plain text\n- Encoding: UTF-8\n- Size: Approximately 94,616 words and 638,272 characters across 5 books",
"## License\nThis dataset is released under the Creative Commons Attribution 4.0 International License."
] | [
"TAGS\n#task_categories-text-generation #language-Latin #region-us \n",
"# Cicero's De finibus bonorum et malorum",
"## Overview\nThis dataset contains the text from Cicero's \"De finibus bonorum et malorum.\" It is intended for use in comparing different neural network architectures within the \"Neural Network Evolution\" project.",
"## Source\nTexts are sourced from The Latin Library. The Latin Library",
"## Preprocessing\nThe dataset was prepared using a Python script for fetching and processing the text. The script ensures consistent formatting and encoding.\n\nFormatting decision\n- remove all text in html header\n- remove all html tags and tag bodies for anchors\n- remove any references in brackets. ex: [1]\n- remove any implicit text additions. ex: [et]\n- collapse newlines to two separating paragraphs\n- standardize paragraph intent to one tab\n\nSee fetch_text.py for details.",
"## Structure of the Dataset\n- Format: Plain text\n- Encoding: UTF-8\n- Size: Approximately 94,616 words and 638,272 characters across 5 books",
"## License\nThis dataset is released under the Creative Commons Attribution 4.0 International License."
] |
61b11988c29ac0ea584070edf064bf7a23a38a21 |
# Dataset of lilulu/リルル (Granblue Fantasy)
This is the dataset of lilulu/リルル (Granblue Fantasy), containing 10 images and their tags.
The core tags of this character are `bow, hair_bow, long_hair, pointy_ears, blue_eyes, bangs, yellow_bow, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 11.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilulu_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 7.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilulu_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 15.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilulu_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 10.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilulu_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 21.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilulu_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
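For training-oriented use, the IMG+TXT packages can be fetched with the same `hf_hub_download` pattern used below for the raw package. The sketch assumes the usual IMG+TXT layout, in which each image is paired with a same-named `.txt` file holding its comma-separated tags; that layout is inferred from the package type, not documented in this card.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages (here: shorter side <= 800 px)
zip_file = hf_hub_download(
    repo_id='CyberHarem/lilulu_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'lilulu_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file (assumed IMG+TXT layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                tags = f.read().strip()
            print(name, '->', tags)
```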
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lilulu_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, open_mouth, white_background, simple_background, white_gloves, bare_shoulders, blush, harvin, holding, white_thighhighs, full_body, wand, :d, frilled_dress, idol, one_eye_closed, puffy_short_sleeves, shoes, yellow_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | open_mouth | white_background | simple_background | white_gloves | bare_shoulders | blush | harvin | holding | white_thighhighs | full_body | wand | :d | frilled_dress | idol | one_eye_closed | puffy_short_sleeves | shoes | yellow_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------|:-------------------|:--------------------|:---------------|:-----------------|:--------|:---------|:----------|:-------------------|:------------|:-------|:-----|:----------------|:-------|:-----------------|:----------------------|:--------|:---------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/lilulu_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:47:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:50:18+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of lilulu/リルル (Granblue Fantasy)
========================================
This is the dataset of lilulu/リルル (Granblue Fantasy), containing 10 images and their tags.
The core tags of this character are 'bow, hair\_bow, long\_hair, pointy\_ears, blue\_eyes, bangs, yellow\_bow, grey\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
352a7d5a5e7e99e4ea5c3e9b003928484d3d9810 |
# Dataset of sierokarte/シェロカルテ (Granblue Fantasy)
This is the dataset of sierokarte/シェロカルテ (Granblue Fantasy), containing 13 images and their tags.
The core tags of this character are `pointy_ears, short_hair, blue_eyes, blonde_hair, brown_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 11.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sierokarte_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 8.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sierokarte_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 15.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sierokarte_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 10.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sierokarte_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 18.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sierokarte_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sierokarte_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | harvin, 1girl, open_mouth, solo, bag, long_sleeves, looking_at_viewer, shirt, :d, blush, parrot, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | harvin | 1girl | open_mouth | solo | bag | long_sleeves | looking_at_viewer | shirt | :d | blush | parrot | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------|:--------|:-------------|:-------|:------|:---------------|:--------------------|:--------|:-----|:--------|:---------|:--------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sierokarte_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:47:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:52:15+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of sierokarte/シェロカルテ (Granblue Fantasy)
===============================================
This is the dataset of sierokarte/シェロカルテ (Granblue Fantasy), containing 13 images and their tags.
The core tags of this character are 'pointy\_ears, short\_hair, blue\_eyes, blonde\_hair, brown\_hair, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
58aff75477a67522764cb1cce904a25688c42acf |
# Dataset of daetta/ダエッタ (Granblue Fantasy)
This is the dataset of daetta/ダエッタ (Granblue Fantasy), containing 23 images and their tags.
The core tags of this character are `breasts, horns, brown_eyes, large_breasts, long_hair, brown_hair, pointy_ears, animal_hat, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 16.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daetta_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 13.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daetta_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 48 | 26.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daetta_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 15.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daetta_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 48 | 29.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daetta_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/daetta_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, draph, bell, open_mouth, solo, fur_trim, blush, looking_at_viewer, brown_gloves, simple_background, smile, white_background, pantyhose |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | bell | open_mouth | solo | fur_trim | blush | looking_at_viewer | brown_gloves | simple_background | smile | white_background | pantyhose |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:-------|:-----------|:--------|:--------------------|:---------------|:--------------------|:--------|:-------------------|:------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/daetta_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:47:23+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:53:36+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of daetta/ダエッタ (Granblue Fantasy)
=========================================
This is the dataset of daetta/ダエッタ (Granblue Fantasy), containing 23 images and their tags.
The core tags of this character are 'breasts, horns, brown\_eyes, large\_breasts, long\_hair, brown\_hair, pointy\_ears, animal\_hat, hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6a0b0e64716986a6c42a4957492a32f6d2f28b96 |
# Dataset of betor/ヴェトル (Granblue Fantasy)
This is the dataset of betor/ヴェトル (Granblue Fantasy), containing 34 images and their tags.
The core tags of this character are `long_hair, ribbon, hair_ribbon, very_long_hair, drill_hair, yellow_eyes, blue_hair, hairband, bangs, breasts, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 37.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betor_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 27.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betor_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 70 | 50.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betor_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 35.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betor_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 70 | 64.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/betor_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/betor_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, long_sleeves, brown_eyes, closed_mouth, collarbone, detached_sleeves, blush, cleavage, puffy_sleeves, simple_background, smile, white_background, white_dress |
| 1 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, star_(symbol), bare_shoulders |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | looking_at_viewer | solo | long_sleeves | brown_eyes | closed_mouth | collarbone | detached_sleeves | blush | cleavage | puffy_sleeves | simple_background | smile | white_background | white_dress | star_(symbol) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------------|:-------|:---------------|:-------------|:---------------|:-------------|:-------------------|:--------|:-----------|:----------------|:--------------------|:--------|:-------------------|:--------------|:----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | X |
| CyberHarem/betor_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T17:47:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T17:53:28+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of betor/ヴェトル (Granblue Fantasy)
========================================
This is the dataset of betor/ヴェトル (Granblue Fantasy), containing 34 images and their tags.
The core tags of this character are 'long\_hair, ribbon, hair\_ribbon, very\_long\_hair, drill\_hair, yellow\_eyes, blue\_hair, hairband, bangs, breasts, purple\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
3c14fc9f073a18b47c767138d2866e00580f4b13 |
# Dataset Card for Evaluation run of cloudyu/Venus_DPO_50
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Venus_DPO_50](https://huggingface.co/cloudyu/Venus_DPO_50) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Venus_DPO_50",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T17:50:57.443719](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Venus_DPO_50/blob/main/results_2024-01-21T17-50-57.443719.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6666481748258403,
"acc_stderr": 0.03169619700244902,
"acc_norm": 0.6675473139195528,
"acc_norm_stderr": 0.032342525727742856,
"mc1": 0.5801713586291309,
"mc1_stderr": 0.017277030301775766,
"mc2": 0.7263318450468071,
"mc2_stderr": 0.014889987688937593
},
"harness|arc:challenge|25": {
"acc": 0.689419795221843,
"acc_stderr": 0.013522292098053059,
"acc_norm": 0.7073378839590444,
"acc_norm_stderr": 0.013295916103619427
},
"harness|hellaswag|10": {
"acc": 0.7125074686317466,
"acc_stderr": 0.004516681953879092,
"acc_norm": 0.8846843258315077,
"acc_norm_stderr": 0.0031874975090874207
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643527,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643527
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.02451819564187933,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.02451819564187933
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.022779719088733396,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.022779719088733396
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4915254237288136,
"acc_stderr": 0.01276840169726906,
"acc_norm": 0.4915254237288136,
"acc_norm_stderr": 0.01276840169726906
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103124,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5801713586291309,
"mc1_stderr": 0.017277030301775766,
"mc2": 0.7263318450468071,
"mc2_stderr": 0.014889987688937593
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370632
},
"harness|gsm8k|5": {
"acc": 0.6360879454131918,
"acc_stderr": 0.013252539227966193
}
}
```
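Beyond loading individual task splits with `load_dataset`, the aggregated numbers shown above can be read straight from the results file linked at the top of this section. The sketch below is illustrative: the file name comes from that link, and since the exact key layout of the file is not documented here, the lookup is handled defensively.
```python
import json
from huggingface_hub import hf_hub_download

# fetch the aggregated results file linked above
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_cloudyu__Venus_DPO_50",
    repo_type="dataset",
    filename="results_2024-01-21T17-50-57.443719.json",
)

with open(results_path, "r", encoding="utf-8") as f:
    data = json.load(f)

# the per-task blocks shown above may sit at the top level or under a "results" key,
# depending on the file layout, so handle both cases
summary = data.get("results", data).get("all", {})
print(summary)
```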
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Venus_DPO_50 | [
"region:us"
] | 2024-01-21T17:53:12+00:00 | {"pretty_name": "Evaluation run of cloudyu/Venus_DPO_50", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Venus_DPO_50](https://huggingface.co/cloudyu/Venus_DPO_50) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Venus_DPO_50\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T17:50:57.443719](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Venus_DPO_50/blob/main/results_2024-01-21T17-50-57.443719.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6666481748258403,\n \"acc_stderr\": 0.03169619700244902,\n \"acc_norm\": 0.6675473139195528,\n \"acc_norm_stderr\": 0.032342525727742856,\n \"mc1\": 0.5801713586291309,\n \"mc1_stderr\": 0.017277030301775766,\n \"mc2\": 0.7263318450468071,\n \"mc2_stderr\": 0.014889987688937593\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.689419795221843,\n \"acc_stderr\": 0.013522292098053059,\n \"acc_norm\": 0.7073378839590444,\n \"acc_norm_stderr\": 0.013295916103619427\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7125074686317466,\n \"acc_stderr\": 0.004516681953879092,\n \"acc_norm\": 0.8846843258315077,\n \"acc_norm_stderr\": 0.0031874975090874207\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n 
\"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643527,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643527\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 
0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.02451819564187933,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.02451819564187933\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.022779719088733396,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.022779719088733396\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4915254237288136,\n \"acc_stderr\": 0.01276840169726906,\n \"acc_norm\": 0.4915254237288136,\n \"acc_norm_stderr\": 0.01276840169726906\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103124,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5801713586291309,\n \"mc1_stderr\": 0.017277030301775766,\n \"mc2\": 0.7263318450468071,\n \"mc2_stderr\": 0.014889987688937593\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370632\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6360879454131918,\n \"acc_stderr\": 0.013252539227966193\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/Venus_DPO_50", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|arc:challenge|25_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|gsm8k|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hellaswag|10_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-50-57.443719.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-50-57.443719.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-50-57.443719.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T17-50-57.443719.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-50-57.443719.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-50-57.443719.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["**/details_harness|winogrande|5_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T17-50-57.443719.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T17_50_57.443719", "path": ["results_2024-01-21T17-50-57.443719.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T17-50-57.443719.parquet"]}]}]} | 2024-01-21T17:53:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/Venus_DPO_50
Dataset automatically created during the evaluation run of model cloudyu/Venus_DPO_50 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
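A minimal sketch is given below; the repository id `open-llm-leaderboard/details_cloudyu__Venus_DPO_50` is an assumption based on the leaderboard's usual naming convention for these detail datasets, while the config name `harness_winogrande_5` and the `latest` split are taken from the configs listed in this card's metadata.

```python
from datasets import load_dataset

# repo id assumed from the leaderboard's naming convention for detail datasets;
# the config name and the "latest" split come from this card's metadata
data = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Venus_DPO_50",
    "harness_winogrande_5",
    split="latest",  # each run also has a timestamped split; "latest" points to the newest one
)
```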
## Latest results
These are the latest results from run 2024-01-21T17:50:57.443719 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
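The aggregated metrics for this run are stored in the `results` configuration of the dataset; a minimal sketch for pulling the latest snapshot, using the same assumed repository id as above (the config name and its `latest` split are listed in this card's metadata):

```python
from datasets import load_dataset

# aggregated metrics of the whole run; "latest" always points to the newest snapshot
results = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Venus_DPO_50",  # assumed repo id
    "results",
    split="latest",
)
print(results[0])
```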
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/Venus_DPO_50\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Venus_DPO_50 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T17:50:57.443719(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/Venus_DPO_50\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Venus_DPO_50 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T17:50:57.443719(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f2f2eafea1837c16e8d67874bab4811a958b4bf9 | # Dataset Card for "math_23k_double_kernel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/math_23k_double_kernel | [
"region:us"
] | 2024-01-21T17:58:48+00:00 | {"dataset_info": {"features": [{"name": "text", "struct": [{"name": "asm", "dtype": "string"}, {"name": "c", "dtype": "string"}, {"name": "driver", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 21695095, "num_examples": 21104}], "download_size": 1822925, "dataset_size": 21695095}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-21T17:58:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "math_23k_double_kernel"
More Information needed | [
"# Dataset Card for \"math_23k_double_kernel\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"math_23k_double_kernel\"\n\nMore Information needed"
] |
277b3928c20f660cbcad221ef73161f2121cd62b |
# Dataset of minigob/ミニゴブ (Granblue Fantasy)
This is the dataset of minigob/ミニゴブ (Granblue Fantasy), containing 12 images and their tags.
The core tags of this character are `pointy_ears, short_hair, bangs, brown_eyes, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 9.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minigob_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minigob_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 15.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minigob_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 8.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minigob_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 17.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minigob_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/minigob_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
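The preprocessed IMG+TXT packages from the table above can also be used without waifuc. The sketch below assumes the usual layout for these packages, i.e. each image inside `dataset-800.zip` is paired with a same-named `.txt` tag file:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/minigob_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)
pkg_dir = 'dataset_800'
os.makedirs(pkg_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pkg_dir)

# pair each image with its same-named .txt tag file (assumed layout)
for name in sorted(os.listdir(pkg_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(pkg_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```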
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, hood, looking_at_viewer, solo, blush, holding_staff, white_background, wide_sleeves, simple_background, long_sleeves, white_dress, cloak, full_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hood | looking_at_viewer | solo | blush | holding_staff | white_background | wide_sleeves | simple_background | long_sleeves | white_dress | cloak | full_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------|:--------|:----------------|:-------------------|:---------------|:--------------------|:---------------|:--------------|:--------|:------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/minigob_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:02:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T18:05:25+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of minigob/ミニゴブ (Granblue Fantasy)
==========================================
This is the dataset of minigob/ミニゴブ (Granblue Fantasy), containing 12 images and their tags.
The core tags of this character are 'pointy\_ears, short\_hair, bangs, brown\_eyes, purple\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
55a9cc4f8ebdac6068c5b82daa01882ef907005f |
# Dataset of leona (Granblue Fantasy)
This is the dataset of leona (Granblue Fantasy), containing 38 images and their tags.
The core tags of this character are `brown_hair, long_hair, hair_ornament, blue_eyes, breasts, bangs, large_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 38 | 48.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 38 | 28.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 77 | 54.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 38 | 42.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 77 | 78.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/leona_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
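Once loaded, the same `LocalSource` iterator can be used for quick filtering; the sketch below keeps only images tagged `solo`, assuming `item.meta['tags']` behaves like a collection of tag names (as suggested by the loading example above):

```python
from waifuc.source import LocalSource

# iterate the extracted raw dataset and keep only items tagged 'solo'
source = LocalSource('dataset_dir')
solo_items = [item for item in source if 'solo' in item.meta['tags']]
print(f'{len(solo_items)} solo images found')
```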
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, smile, solo, blush, bare_shoulders, collarbone, open_mouth, single_braid, black_bikini, braided_ponytail, medium_breasts, navel, white_background |
| 1 | 9 |  |  |  |  |  | 1girl, detached_sleeves, black_thighhighs, solo, holding_weapon, looking_at_viewer, spear, white_background, medium_breasts, simple_background, cleavage, gauntlets, thigh_boots, bare_shoulders, dress, full_body, low_ponytail, open_mouth, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | smile | solo | blush | bare_shoulders | collarbone | open_mouth | single_braid | black_bikini | braided_ponytail | medium_breasts | navel | white_background | detached_sleeves | black_thighhighs | holding_weapon | spear | simple_background | gauntlets | thigh_boots | dress | full_body | low_ponytail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:--------|:-------|:--------|:-----------------|:-------------|:-------------|:---------------|:---------------|:-------------------|:-----------------|:--------|:-------------------|:-------------------|:-------------------|:-----------------|:--------|:--------------------|:------------|:--------------|:--------|:------------|:---------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/leona_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:02:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T18:13:04+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of leona (Granblue Fantasy)
===================================
This is the dataset of leona (Granblue Fantasy), containing 38 images and their tags.
The core tags of this character are 'brown\_hair, long\_hair, hair\_ornament, blue\_eyes, breasts, bangs, large\_breasts, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
0d1903127712b906d788edac20cbedc39c56273e |
# Dataset of karen/カレン (Granblue Fantasy)
This is the dataset of karen/カレン (Granblue Fantasy), containing 24 images and their tags.
The core tags of this character are `hair_ornament, long_hair, brown_hair, blue_eyes, breasts, braid, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 27.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karen_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 17.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karen_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 56 | 35.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karen_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 25.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karen_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 56 | 45.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karen_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/karen_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
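The tag clusters below were mined from these per-image tags; a quick way to eyeball tag frequencies yourself is to count them over the loaded source, again assuming `item.meta['tags']` is a collection of tag names:

```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag appears across the extracted raw dataset
counter = Counter()
for item in LocalSource('dataset_dir'):
    # list() yields tag names whether meta['tags'] is a list or a tag->score mapping
    counter.update(list(item.meta['tags']))
print(counter.most_common(20))
```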
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, smile, solo, thighhighs, gloves, looking_at_viewer, cleavage, plaid_skirt, thigh_boots, sword, pantyshot, white_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | thighhighs | gloves | looking_at_viewer | cleavage | plaid_skirt | thigh_boots | sword | pantyshot | white_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:---------|:--------------------|:-----------|:--------------|:--------------|:--------|:------------|:----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/karen_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:02:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T18:07:44+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of karen/カレン (Granblue Fantasy)
=======================================
This is the dataset of karen/カレン (Granblue Fantasy), containing 24 images and their tags.
The core tags of this character are 'hair\_ornament, long\_hair, brown\_hair, blue\_eyes, breasts, braid, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
79d5d5836b5d984bee3d01826d3986f75683fdd8 |
# Dataset of funf/フュンフ (Granblue Fantasy)
This is the dataset of funf/フュンフ (Granblue Fantasy), containing 10 images and their tags.
The core tags of this character are `pointy_ears, long_hair, red_hair, blue_eyes, dark-skinned_female, dark_skin, fangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 4.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/funf_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 4.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/funf_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 14 | 6.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/funf_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 4.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/funf_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 14 | 6.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/funf_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/funf_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
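The IMG+TXT packages listed above can also be used without waifuc. The sketch below downloads `dataset-800.zip` and pairs each image with its caption file; it assumes the usual convention that every image ships with a same-named `.txt` tag file, so treat it as a starting point rather than a guaranteed layout.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/funf_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)
pkg_dir = 'dataset_800'
os.makedirs(pkg_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pkg_dir)

# pair each image with its (assumed) same-named .txt tag file
for name in sorted(os.listdir(pkg_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        txt_path = os.path.join(pkg_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```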
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | harvin, 1girl, open_mouth, solo, :d, cape, puffy_pants, black_footwear, black_gloves, boots, looking_at_viewer, black_pants, blush, star_(symbol) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | harvin | 1girl | open_mouth | solo | :d | cape | puffy_pants | black_footwear | black_gloves | boots | looking_at_viewer | black_pants | blush | star_(symbol) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------|:--------|:-------------|:-------|:-----|:-------|:--------------|:-----------------|:---------------|:--------|:--------------------|:--------------|:--------|:----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/funf_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:03:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T18:06:12+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of funf/フュンフ (Granblue Fantasy)
=======================================
This is the dataset of funf/フュンフ (Granblue Fantasy), containing 10 images and their tags.
The core tags of this character are 'pointy\_ears, long\_hair, red\_hair, blue\_eyes, dark-skinned\_female, dark\_skin, fangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1f14cae36c7a483b1931200c1aa410e22039df4d |
# Dataset Card for Evaluation run of AA051612/B0121
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051612/B0121](https://huggingface.co/AA051612/B0121) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051612__B0121",
"harness_winogrande_5",
split="train")
```
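To pull the aggregated metrics instead of the per-task details, the same `load_dataset` call can target the "results" configuration mentioned above. The snippet below is a minimal sketch of that; the config name comes from this card, so check the repo's config list if it fails.
```python
from datasets import load_dataset

# aggregated results of the run, as described above
results = load_dataset(
    "open-llm-leaderboard/details_AA051612__B0121",
    "results",
    split="train",
)
print(results[0])
```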
## Latest results
These are the [latest results from run 2024-01-21T18:18:25.234696](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051612__B0121/blob/main/results_2024-01-21T18-18-25.234696.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.8430914237407806,
"acc_stderr": 0.023516882940577213,
"acc_norm": 0.8532858209761611,
"acc_norm_stderr": 0.02386403647892808,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093904,
"mc2": 0.586289923104453,
"mc2_stderr": 0.015188509759243112
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.01397545412275656,
"acc_norm": 0.6834470989761092,
"acc_norm_stderr": 0.01359243151906808
},
"harness|hellaswag|10": {
"acc": 0.657837084246166,
"acc_stderr": 0.004734642167493358,
"acc_norm": 0.8530173272256523,
"acc_norm_stderr": 0.003533649851728465
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.8592592592592593,
"acc_stderr": 0.03004136260951689,
"acc_norm": 0.8592592592592593,
"acc_norm_stderr": 0.03004136260951689
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9210526315789473,
"acc_stderr": 0.02194434281824793,
"acc_norm": 0.9210526315789473,
"acc_norm_stderr": 0.02194434281824793
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.9132075471698113,
"acc_stderr": 0.01732701548925273,
"acc_norm": 0.9132075471698113,
"acc_norm_stderr": 0.01732701548925273
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9583333333333334,
"acc_stderr": 0.016710315802959976,
"acc_norm": 0.9583333333333334,
"acc_norm_stderr": 0.016710315802959976
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.8497109826589595,
"acc_stderr": 0.027248024102430903,
"acc_norm": 0.8497109826589595,
"acc_norm_stderr": 0.027248024102430903
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.851063829787234,
"acc_stderr": 0.023274117848010447,
"acc_norm": 0.851063829787234,
"acc_norm_stderr": 0.023274117848010447
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8482758620689655,
"acc_stderr": 0.029896107594574617,
"acc_norm": 0.8482758620689655,
"acc_norm_stderr": 0.029896107594574617
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.828042328042328,
"acc_stderr": 0.019434186770172772,
"acc_norm": 0.828042328042328,
"acc_norm_stderr": 0.019434186770172772
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9580645161290322,
"acc_stderr": 0.011402733315581105,
"acc_norm": 0.9580645161290322,
"acc_norm_stderr": 0.011402733315581105
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.7980295566502463,
"acc_stderr": 0.028247350122180246,
"acc_norm": 0.7980295566502463,
"acc_norm_stderr": 0.028247350122180246
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9454545454545454,
"acc_stderr": 0.01773281509812168,
"acc_norm": 0.9454545454545454,
"acc_norm_stderr": 0.01773281509812168
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9747474747474747,
"acc_stderr": 0.01117803212271851,
"acc_norm": 0.9747474747474747,
"acc_norm_stderr": 0.01117803212271851
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792208,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792208
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.015382418053263056,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.015382418053263056
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.9201680672268907,
"acc_stderr": 0.017605474225715064,
"acc_norm": 0.9201680672268907,
"acc_norm_stderr": 0.017605474225715064
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.6556291390728477,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.6556291390728477,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.944954128440367,
"acc_stderr": 0.009778411055200767,
"acc_norm": 0.944954128440367,
"acc_norm_stderr": 0.009778411055200767
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.8009259259259259,
"acc_stderr": 0.027232298462690225,
"acc_norm": 0.8009259259259259,
"acc_norm_stderr": 0.027232298462690225
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9656862745098039,
"acc_stderr": 0.012776266045095928,
"acc_norm": 0.9656862745098039,
"acc_norm_stderr": 0.012776266045095928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9493670886075949,
"acc_stderr": 0.014271760025370174,
"acc_norm": 0.9493670886075949,
"acc_norm_stderr": 0.014271760025370174
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8878923766816144,
"acc_stderr": 0.021174894206346103,
"acc_norm": 0.8878923766816144,
"acc_norm_stderr": 0.021174894206346103
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9541984732824428,
"acc_stderr": 0.01833528478684204,
"acc_norm": 0.9541984732824428,
"acc_norm_stderr": 0.01833528478684204
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9351851851851852,
"acc_stderr": 0.023800937426629216,
"acc_norm": 0.9351851851851852,
"acc_norm_stderr": 0.023800937426629216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9570552147239264,
"acc_stderr": 0.015928181929854,
"acc_norm": 0.9570552147239264,
"acc_norm_stderr": 0.015928181929854
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.7232142857142857,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.7232142857142857,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.9320388349514563,
"acc_stderr": 0.02491995914251447,
"acc_norm": 0.9320388349514563,
"acc_norm_stderr": 0.02491995914251447
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9871794871794872,
"acc_stderr": 0.00737009013738534,
"acc_norm": 0.9871794871794872,
"acc_norm_stderr": 0.00737009013738534
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594194,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594194
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.946360153256705,
"acc_stderr": 0.008056911822364874,
"acc_norm": 0.946360153256705,
"acc_norm_stderr": 0.008056911822364874
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.869942196531792,
"acc_stderr": 0.018109391528221358,
"acc_norm": 0.869942196531792,
"acc_norm_stderr": 0.018109391528221358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8893854748603351,
"acc_stderr": 0.01049016541498617,
"acc_norm": 0.8893854748603351,
"acc_norm_stderr": 0.01049016541498617
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.016240995183674178,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.016240995183674178
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.9035369774919614,
"acc_stderr": 0.016767663560541792,
"acc_norm": 0.9035369774919614,
"acc_norm_stderr": 0.016767663560541792
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.9228395061728395,
"acc_stderr": 0.014847704893944932,
"acc_norm": 0.9228395061728395,
"acc_norm_stderr": 0.014847704893944932
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.7836879432624113,
"acc_stderr": 0.024561720560562796,
"acc_norm": 0.7836879432624113,
"acc_norm_stderr": 0.024561720560562796
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.8083441981747066,
"acc_stderr": 0.010052823804268043,
"acc_norm": 0.8083441981747066,
"acc_norm_stderr": 0.010052823804268043
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9558823529411765,
"acc_stderr": 0.012474517239446882,
"acc_norm": 0.9558823529411765,
"acc_norm_stderr": 0.012474517239446882
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.9003267973856209,
"acc_stderr": 0.012119053136608481,
"acc_norm": 0.9003267973856209,
"acc_norm_stderr": 0.012119053136608481
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8090909090909091,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.8090909090909091,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8857142857142857,
"acc_stderr": 0.020367976491952145,
"acc_norm": 0.8857142857142857,
"acc_norm_stderr": 0.020367976491952145
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9402985074626866,
"acc_stderr": 0.016753689791525094,
"acc_norm": 0.9402985074626866,
"acc_norm_stderr": 0.016753689791525094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.96,
"acc_stderr": 0.01969463855669321,
"acc_norm": 0.96,
"acc_norm_stderr": 0.01969463855669321
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6746987951807228,
"acc_stderr": 0.03647168523683226,
"acc_norm": 0.6746987951807228,
"acc_norm_stderr": 0.03647168523683226
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9415204678362573,
"acc_stderr": 0.017996678857280124,
"acc_norm": 0.9415204678362573,
"acc_norm_stderr": 0.017996678857280124
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093904,
"mc2": 0.586289923104453,
"mc2_stderr": 0.015188509759243112
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.01120186274448705
},
"harness|gsm8k|5": {
"acc": 0.5155420773313116,
"acc_stderr": 0.013765829454512893
}
}
```
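The same numbers can be retrieved programmatically from the JSON file linked above. Below is a minimal sketch using `hf_hub_download`; the filename is taken from the link to the 2024-01-21T18-18-25.234696 run and would change for newer runs, and the exact nesting of the JSON may differ by harness version, so the sketch inspects the keys before reading the "all" block shown above.
```python
import json

from huggingface_hub import hf_hub_download

# download the raw results file for this run from the dataset repo
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_AA051612__B0121",
    repo_type="dataset",
    filename="results_2024-01-21T18-18-25.234696.json",
)
with open(results_path, "r", encoding="utf-8") as f:
    results = json.load(f)

# depending on the harness version, the metrics may sit at the top level
# or under a "results" key, so check the keys before indexing
print(list(results.keys()))
metrics = results.get("results", results)
print(metrics["all"])
```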
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051612__B0121 | [
"region:us"
] | 2024-01-21T18:20:38+00:00 | {"pretty_name": "Evaluation run of AA051612/B0121", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051612/B0121](https://huggingface.co/AA051612/B0121) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051612__B0121\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T18:18:25.234696](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051612__B0121/blob/main/results_2024-01-21T18-18-25.234696.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8430914237407806,\n \"acc_stderr\": 0.023516882940577213,\n \"acc_norm\": 0.8532858209761611,\n \"acc_norm_stderr\": 0.02386403647892808,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093904,\n \"mc2\": 0.586289923104453,\n \"mc2_stderr\": 0.015188509759243112\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.01397545412275656,\n \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.01359243151906808\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657837084246166,\n \"acc_stderr\": 0.004734642167493358,\n \"acc_norm\": 0.8530173272256523,\n \"acc_norm_stderr\": 0.003533649851728465\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8592592592592593,\n \"acc_stderr\": 0.03004136260951689,\n \"acc_norm\": 0.8592592592592593,\n \"acc_norm_stderr\": 0.03004136260951689\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9210526315789473,\n \"acc_stderr\": 0.02194434281824793,\n \"acc_norm\": 0.9210526315789473,\n \"acc_norm_stderr\": 0.02194434281824793\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.9132075471698113,\n \"acc_stderr\": 0.01732701548925273,\n \"acc_norm\": 0.9132075471698113,\n \"acc_norm_stderr\": 0.01732701548925273\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9583333333333334,\n \"acc_stderr\": 0.016710315802959976,\n \"acc_norm\": 0.9583333333333334,\n \"acc_norm_stderr\": 0.016710315802959976\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 
0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.8497109826589595,\n \"acc_stderr\": 0.027248024102430903,\n \"acc_norm\": 0.8497109826589595,\n \"acc_norm_stderr\": 0.027248024102430903\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.851063829787234,\n \"acc_stderr\": 0.023274117848010447,\n \"acc_norm\": 0.851063829787234,\n \"acc_norm_stderr\": 0.023274117848010447\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8482758620689655,\n \"acc_stderr\": 0.029896107594574617,\n \"acc_norm\": 0.8482758620689655,\n \"acc_norm_stderr\": 0.029896107594574617\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.828042328042328,\n \"acc_stderr\": 0.019434186770172772,\n \"acc_norm\": 0.828042328042328,\n \"acc_norm_stderr\": 0.019434186770172772\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9580645161290322,\n \"acc_stderr\": 0.011402733315581105,\n \"acc_norm\": 0.9580645161290322,\n \"acc_norm_stderr\": 0.011402733315581105\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.7980295566502463,\n \"acc_stderr\": 0.028247350122180246,\n \"acc_norm\": 0.7980295566502463,\n \"acc_norm_stderr\": 0.028247350122180246\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.9454545454545454,\n \"acc_stderr\": 0.01773281509812168,\n \"acc_norm\": 0.9454545454545454,\n \"acc_norm_stderr\": 0.01773281509812168\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9747474747474747,\n \"acc_stderr\": 0.01117803212271851,\n \"acc_norm\": 0.9747474747474747,\n \"acc_norm_stderr\": 0.01117803212271851\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792208,\n \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792208\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8974358974358975,\n 
\"acc_stderr\": 0.015382418053263056,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.015382418053263056\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.029045600290616258,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.029045600290616258\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.9201680672268907,\n \"acc_stderr\": 0.017605474225715064,\n \"acc_norm\": 0.9201680672268907,\n \"acc_norm_stderr\": 0.017605474225715064\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.6556291390728477,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.6556291390728477,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.944954128440367,\n \"acc_stderr\": 0.009778411055200767,\n \"acc_norm\": 0.944954128440367,\n \"acc_norm_stderr\": 0.009778411055200767\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.8009259259259259,\n \"acc_stderr\": 0.027232298462690225,\n \"acc_norm\": 0.8009259259259259,\n \"acc_norm_stderr\": 0.027232298462690225\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9656862745098039,\n \"acc_stderr\": 0.012776266045095928,\n \"acc_norm\": 0.9656862745098039,\n \"acc_norm_stderr\": 0.012776266045095928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9493670886075949,\n \"acc_stderr\": 0.014271760025370174,\n \"acc_norm\": 0.9493670886075949,\n \"acc_norm_stderr\": 0.014271760025370174\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8878923766816144,\n \"acc_stderr\": 0.021174894206346103,\n \"acc_norm\": 0.8878923766816144,\n \"acc_norm_stderr\": 0.021174894206346103\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9541984732824428,\n \"acc_stderr\": 0.01833528478684204,\n \"acc_norm\": 0.9541984732824428,\n \"acc_norm_stderr\": 0.01833528478684204\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9351851851851852,\n \"acc_stderr\": 0.023800937426629216,\n \"acc_norm\": 0.9351851851851852,\n \"acc_norm_stderr\": 0.023800937426629216\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.9570552147239264,\n \"acc_stderr\": 0.015928181929854,\n \"acc_norm\": 0.9570552147239264,\n \"acc_norm_stderr\": 0.015928181929854\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.7232142857142857,\n \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.7232142857142857,\n \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9320388349514563,\n \"acc_stderr\": 0.02491995914251447,\n \"acc_norm\": 0.9320388349514563,\n \"acc_norm_stderr\": 0.02491995914251447\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9871794871794872,\n \"acc_stderr\": 0.00737009013738534,\n \"acc_norm\": 0.9871794871794872,\n \"acc_norm_stderr\": 0.00737009013738534\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594194,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594194\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.946360153256705,\n \"acc_stderr\": 0.008056911822364874,\n \"acc_norm\": 0.946360153256705,\n 
\"acc_norm_stderr\": 0.008056911822364874\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.869942196531792,\n \"acc_stderr\": 0.018109391528221358,\n \"acc_norm\": 0.869942196531792,\n \"acc_norm_stderr\": 0.018109391528221358\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8893854748603351,\n \"acc_stderr\": 0.01049016541498617,\n \"acc_norm\": 0.8893854748603351,\n \"acc_norm_stderr\": 0.01049016541498617\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.016240995183674178,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.016240995183674178\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.9035369774919614,\n \"acc_stderr\": 0.016767663560541792,\n \"acc_norm\": 0.9035369774919614,\n \"acc_norm_stderr\": 0.016767663560541792\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.9228395061728395,\n \"acc_stderr\": 0.014847704893944932,\n \"acc_norm\": 0.9228395061728395,\n \"acc_norm_stderr\": 0.014847704893944932\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.7836879432624113,\n \"acc_stderr\": 0.024561720560562796,\n \"acc_norm\": 0.7836879432624113,\n \"acc_norm_stderr\": 0.024561720560562796\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.8083441981747066,\n \"acc_stderr\": 0.010052823804268043,\n \"acc_norm\": 0.8083441981747066,\n \"acc_norm_stderr\": 0.010052823804268043\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.9558823529411765,\n \"acc_stderr\": 0.012474517239446882,\n \"acc_norm\": 0.9558823529411765,\n \"acc_norm_stderr\": 0.012474517239446882\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.9003267973856209,\n \"acc_stderr\": 0.012119053136608481,\n \"acc_norm\": 0.9003267973856209,\n \"acc_norm_stderr\": 0.012119053136608481\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8090909090909091,\n \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.8090909090909091,\n \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8857142857142857,\n \"acc_stderr\": 0.020367976491952145,\n \"acc_norm\": 0.8857142857142857,\n \"acc_norm_stderr\": 0.020367976491952145\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9402985074626866,\n \"acc_stderr\": 0.016753689791525094,\n \"acc_norm\": 0.9402985074626866,\n \"acc_norm_stderr\": 0.016753689791525094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.96,\n \"acc_stderr\": 0.01969463855669321,\n \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.01969463855669321\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6746987951807228,\n \"acc_stderr\": 0.03647168523683226,\n \"acc_norm\": 0.6746987951807228,\n \"acc_norm_stderr\": 0.03647168523683226\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9415204678362573,\n \"acc_stderr\": 0.017996678857280124,\n \"acc_norm\": 0.9415204678362573,\n \"acc_norm_stderr\": 0.017996678857280124\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093904,\n \"mc2\": 0.586289923104453,\n \"mc2_stderr\": 0.015188509759243112\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5155420773313116,\n \"acc_stderr\": 0.013765829454512893\n }\n}\n```", "repo_url": "https://huggingface.co/AA051612/B0121", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|arc:challenge|25_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|gsm8k|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hellaswag|10_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T18-18-25.234696.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T18-18-25.234696.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T18-18-25.234696.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T18-18-25.234696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T18-18-25.234696.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T18-18-25.234696.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["**/details_harness|winogrande|5_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T18-18-25.234696.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T18_18_25.234696", "path": ["results_2024-01-21T18-18-25.234696.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T18-18-25.234696.parquet"]}]}]} | 2024-01-21T18:20:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051612/B0121
Dataset automatically created during the evaluation run of model AA051612/B0121 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
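A minimal sketch (the repository id below is inferred from the usual `open-llm-leaderboard/details_<org>__<model>` naming and is an assumption; any of the configurations listed in this repository can be substituted):

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard naming convention (an assumption);
# any configuration listed in this repository works, and the "latest" split
# points to the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_AA051612__B0121",
    "harness_winogrande_5",
    split="latest",
)
```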
## Latest results
These are the latest results from run 2024-01-21T18:18:25.234696 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051612/B0121\n\n\n\nDataset automatically created during the evaluation run of model AA051612/B0121 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T18:18:25.234696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051612/B0121\n\n\n\nDataset automatically created during the evaluation run of model AA051612/B0121 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T18:18:25.234696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9ddbdeec22bfabd9821b14c6584ad9587ed7bffa | ### Model Details
Developed by: Prince Sultan University - Riotu Lab
This dataset is intended for use in natural language processing tasks, particularly for understanding classical Arabic and religious texts, including text analysis, language modeling, and thematic studies.
Primary Users: Researchers and developers in the field of natural language processing, religious studies, and AI, specifically those working with classical Arabic texts.
Out-of-scope Use Cases: This dataset is not intended for predictive modeling that could lead to ethical concerns, such as surveillance or profiling based on religious texts.
Model/Data Specifications
Format: JSON
Dataset Size: Contains more than 57K rows
Language: Arabic
### Dataset Structure
Fields:
- sura_number: Integer representing the Surah number in the Quran.
- Aya_number: Integer representing the Ayah number in the Surah.
- tafseers: Dictionary mapping Tafseer sources to their text for each Ayah:
- Tafseer Name :
1: "التفسير الميسر",
2: "تفسير الجلالين",
3:"تفسير ابن كثير",
4: "تفسير الوسيط لطنطاوي",
5: "تفسير البغوي",
6: "تفسير القرطبي",
7: "تفسير الطبري",
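A minimal access sketch based on the fields described above (the split name and exact field casing in the hosted files are assumptions; adjust as needed):

```python
from datasets import load_dataset

# Illustrative only: field names follow the structure described above,
# and the "train" split name is an assumption.
ds = load_dataset("riotu-lab/Quran-Tafseers", split="train")

example = ds[0]
print(example["sura_number"], example["Aya_number"])
for tafseer_id, text in example["tafseers"].items():
    print(tafseer_id, text[:80])  # e.g. id 1 corresponds to التفسير الميسر
```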
| riotu-lab/Quran-Tafseers | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:ar",
"license:apache-2.0",
"region:us"
] | 2024-01-21T18:22:02+00:00 | {"language": ["ar"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "pretty_name": "Tibyan For Holy Quran "} | 2024-01-26T17:10:45+00:00 | [] | [
"ar"
] | TAGS
#task_categories-question-answering #size_categories-10K<n<100K #language-Arabic #license-apache-2.0 #region-us
| ### Model Details
Developed by: Prince Sultan University - Riotu Lab
This dataset is intended for use in natural language processing tasks, particularly for understanding classical Arabic and religious texts, including text analysis, language modeling, and thematic studies.
Primary Users: Researchers and developers in the field of natural language processing, religious studies, and AI, specifically those working with classical Arabic texts.
Out-of-scope Use Cases: This dataset is not intended for predictive modeling that could lead to ethical concerns, such as surveillance or profiling based on religious texts.
Model/Data Specifications
Format: Json
Dataset Size: Contains more than 57K rows
Language: Arabic
### Dataset Structure
Fields:
- sura_number: Integer representing the Surah number in the Quran.
- Aya_number: Integer representing the Ayah number in the Surah.
- tafseers: Dictionary mapping Tafseer sources to their text for each Ayah:
- Tafseer Name :
1: "التفسير الميسر",
2: "تفسير الجلالين",
3:"تفسير ابن كثير",
4: "تفسير الوسيط لطنطاوي",
5: "تفسير البغوي",
6: "تفسير القرطبي",
7: "تفسير الطبري",
| [
"### Model Details\n\nDeveloped by: Prince Sultan University - Riotu Lab\n\nThis dataset is intended for use in natural language processing tasks, particularly for understanding classical Arabic and religious texts, including text analysis, language modeling, and thematic studies.\nPrimary Users: Researchers and developers in the field of natural language processing, religious studies, and AI, specifically those working with classical Arabic texts.\nOut-of-scope Use Cases: This dataset is not intended for predictive modeling that could lead to ethical concerns, such as surveillance or profiling based on religious texts.\nModel/Data Specifications\n\nFormat: Json\nDataset Size: Contains more than 57K rows\nLanguage: Arabic",
"### Dataset Structure\n\nFields:\n- sura_number: Integer representing the Surah number in the Quran.-\n- Aya_number: Integer representing the Ayah number in the Surah.\n- tafseers: Dictionary mapping Tafseer sources to their text for each Ayah:\n- Tafseer Name : \n 1: \"التفسير الميسر\",\n 2: \"تفسير الجلالين\",\n 3:\"تفسير ابن كثير\",\n 4: \"تفسير الوسيط لطنطاوي\",\n 5: \"تفسير البغوي\",\n 6: \"تفسير القرطبي\",\n 7: \"تفسير الطبري\","
] | [
"TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-Arabic #license-apache-2.0 #region-us \n",
"### Model Details\n\nDeveloped by: Prince Sultan University - Riotu Lab\n\nThis dataset is intended for use in natural language processing tasks, particularly for understanding classical Arabic and religious texts, including text analysis, language modeling, and thematic studies.\nPrimary Users: Researchers and developers in the field of natural language processing, religious studies, and AI, specifically those working with classical Arabic texts.\nOut-of-scope Use Cases: This dataset is not intended for predictive modeling that could lead to ethical concerns, such as surveillance or profiling based on religious texts.\nModel/Data Specifications\n\nFormat: Json\nDataset Size: Contains more than 57K rows\nLanguage: Arabic",
"### Dataset Structure\n\nFields:\n- sura_number: Integer representing the Surah number in the Quran.-\n- Aya_number: Integer representing the Ayah number in the Surah.\n- tafseers: Dictionary mapping Tafseer sources to their text for each Ayah:\n- Tafseer Name : \n 1: \"التفسير الميسر\",\n 2: \"تفسير الجلالين\",\n 3:\"تفسير ابن كثير\",\n 4: \"تفسير الوسيط لطنطاوي\",\n 5: \"تفسير البغوي\",\n 6: \"تفسير القرطبي\",\n 7: \"تفسير الطبري\","
] |
7791de1fb0bd2d40dc23e1e5bb58547437215603 | ## Dataset Description
The text data (title and abstract) of 164,230 arXiv preprints which are associated with at least one [MSC (mathematical subject classification)](https://en.wikipedia.org/wiki/Mathematics_Subject_Classification) code. Predicting 3-character MSC codes based on the cleaned text (processed title+abstract) amounts to a multi-label classification task.
## Dataset Structure
- The column `cleaned_text` should be used as the input of the text classification task. This is obtained from processing the text data (titles and abstracts) of math-related preprints.
- The last 531 columns are one-hot encoded MSC classes, and should be used as target variables of the multi-label classification task.
- Other columns are auxiliary:
- `url`) the URL of the preprint (the latest version as of December 2023),
- `title`) the original title,
- `abstract`) the original abstract,
- `primary_category`) the primary [arXiv category](https://arxiv.org/category_taxonomy) (for this data, almost always a category of the math archive, or the mathematical physics archive).
- **Subtask**) Predicting `primary_category` based on `cleaned_text`, a multi-class text classification task with ~30 distinct labels.
## Data Splits
Stratified sampling was used for splitting the data so that the proportions of a target variable do not differ much across the splits; a minimal loading sketch follows the table.
|Dataset |Description |Number of instances |
|---------|------------------|----------------------|
|main.zip |the whole data |164,230 |
|train.zip|the training set |104,675 |
|val.zip |the validation set|18,540 |
|test.zip |the test set |41,015 |
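A minimal loading sketch with the Hugging Face `datasets` library (split and column names follow the description above; treat this as illustrative rather than canonical code):

```python
from datasets import load_dataset

# Split names ("train", "validation", "test") and column names follow the
# card description; adjust if the hosted files differ.
ds = load_dataset("FilomKhash/Mathematical_Subject_Class_Prediction")
train, val, test = ds["train"], ds["validation"], ds["test"]

aux_cols = {"cleaned_text", "url", "title", "abstract", "primary_category"}
label_cols = [c for c in train.column_names if c not in aux_cols]  # the 531 one-hot MSC columns

df = train.to_pandas()
X_train = df["cleaned_text"]           # model input
Y_train = df[label_cols].to_numpy()    # multi-label targets, shape (104675, 531)
```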
## Data Collection and Cleaning
The details are outlined in this [notebook](https://github.com/FilomKhash/Math-Preprint-Classifier/blob/main/Scarping%20and%20Cleaning%20the%20Data.ipynb).
As for the raw data, with the help of the [arxiv package](https://pypi.org/project/arxiv/), we scraped preprints listed, or cross-listed, under the math archive. This raw data was then processed:
- dropping preprints with an abnormally high number of versions,
- keeping only the last arXiv version,
- dropping preprints whose metadata does not include any MSC class,
- dropping entries with pre-2010 mathematics subject classification convention,
- concatenating abstract and title strings and carrying out the following steps to obtain the `cleaned_text` column (a rough code sketch of these steps appears after this list):
- removing the LaTeX math environment and URL citations,
  - making the text lower case, normalizing accents and removing special characters,
- removing English and some corpus-specific stop words,
- stemming.
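The sketch below reproduces these cleaning steps approximately; it is illustrative rather than the exact notebook code (it assumes `nltk` with the English stop-word list downloaded, and the corpus-specific stop words and the choice of stemmer shown here are placeholders):

```python
import re
import unicodedata

from nltk.corpus import stopwords                 # requires nltk.download("stopwords")
from nltk.stem.snowball import SnowballStemmer

STOP_WORDS = set(stopwords.words("english")) | {"prove", "result", "paper"}  # corpus-specific words: placeholders
STEMMER = SnowballStemmer("english")              # the stemmer actually used may differ

def clean_text(title: str, abstract: str) -> str:
    text = f"{title} {abstract}"
    text = re.sub(r"\$[^$]*\$", " ", text)                  # drop inline LaTeX math
    text = re.sub(r"https?://\S+", " ", text)               # drop URL citations
    text = unicodedata.normalize("NFKD", text)              # normalize accents ...
    text = text.encode("ascii", "ignore").decode()          # ... and strip combining marks
    text = re.sub(r"[^a-z\s]", " ", text.lower())           # lower-case, remove special characters
    tokens = [STEMMER.stem(t) for t in text.split() if t not in STOP_WORDS]
    return " ".join(tokens)
```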
## Citation
<https://github.com/FilomKhash/Math-Preprint-Classifier> | FilomKhash/Mathematical_Subject_Class_Prediction | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"source_datasets:original",
"language:en",
"license:mit",
"region:us"
] | 2024-01-21T18:24:11+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "source_datasets": ["original"], "task_categories": ["text-classification"], "pretty_name": "MSC", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "train.zip"}, {"split": "validation", "path": "val.zip"}, {"split": "test", "path": "test.zip"}]}]} | 2024-01-28T17:17:15+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #size_categories-100K<n<1M #source_datasets-original #language-English #license-mit #region-us
| Dataset Description
-------------------
The text data (title and abstract) of 164,230 arXiv preprints which are associated with at least one MSC (mathematical subject classification) code. Predicting 3-character MSC codes based on the cleaned text (processed title+abstract) amounts to a multi-label classification task.
Dataset Structure
-----------------
* The column 'cleaned\_text' should be used as the input of the text classification task. This is obtained from processing the text data (titles and abstracts) of math-related preprints.
* The last 531 columns are one-hot encoded MSC classes, and should be used as target variables of the multi-label classification task.
* Other columns are auxiliary:
+ 'url') the URL of the preprint (the latest version as of December 2023),
+ 'title') the original title,
+ 'abstract') the original abstract,
+ 'primary\_category') the primary arXiv category (for this data, almost always a category of the math archive, or the mathematical physics archive).
* Subtask) Predicting 'primary\_category' based on 'cleaned\_text', a multi-class text classification task with ~30 distinct labels.
Data Splits
-----------
Stratified sampling was used for splitting the data so that the proportions of a target variable among the splits are not very different.
Dataset: URL, Description: the whole data, Number of instances: 164,230
Dataset: URL, Description: the training set, Number of instances: 104,675
Dataset: URL, Description: the validation set, Number of instances: 18,540
Dataset: URL, Description: the test set, Number of instances: 41,015
Data Collection and Cleaning
----------------------------
The details are outlined in this notebook.
As for the raw data, with the help of the arxiv package, we scraped preprints listed, or cross-listed, under the math archive. This raw data was then processed:
* dropping preprints with an abnormally high number of versions,
* keeping only the last arXiv version,
* dropping preprints whose metadata does not include any MSC class,
* dropping entries with pre-2010 mathematics subject classification convention,
* concatenating abstract and title strings and carrying out the following steps to obtain the 'cleaned\_text' column:
+ removing the LaTeX math environment and URL citations,
+ make the text lower case, normalizing accents and removing special characters,
+ removing English and some corpus-specific stop words,
+ stemming.
<URL
| [] | [
"TAGS\n#task_categories-text-classification #size_categories-100K<n<1M #source_datasets-original #language-English #license-mit #region-us \n"
] |
996bebadc7ff42739bbbcbb543df8a701a5a87cc |
# Dataset of nia/니아 (Granblue Fantasy)
This is the dataset of nia/니아 (Granblue Fantasy), containing 333 images and their tags.
The core tags of this character are `long_hair, animal_ears, black_hair, red_eyes, bangs, breasts, hair_between_eyes, earrings, ear_piercing`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 333 | 503.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 333 | 282.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 817 | 607.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 333 | 446.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 817 | 878.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nia_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
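The IMG+TXT packages listed above can also be used without waifuc. A minimal sketch, assuming the usual convention that each image is paired with a same-named `.txt` file holding its comma-separated tags (inspect the extracted archive to confirm):

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/nia_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)
img_txt_dir = 'nia_800'
os.makedirs(img_txt_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_txt_dir)

# print the tag line stored next to each image (assumed .txt convention)
for name in sorted(os.listdir(img_txt_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(img_txt_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```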
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, erune, looking_at_viewer, piercing, solo, jewelry, long_sleeves, black_skirt, simple_background, white_background, bags_under_eyes, parted_lips |
| 1 | 6 |  |  |  |  |  | 1girl, bags_under_eyes, erune, jewelry, solo, upper_body, looking_at_viewer, simple_background, white_background, piercing |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, braid, erune, large_breasts, looking_at_viewer, solo, black_one-piece_swimsuit, blush, cleavage, official_alternate_costume, collarbone, covered_navel, closed_mouth, simple_background, sitting, thighs |
| 3 | 14 |  |  |  |  |  | 1girl, ass, bare_shoulders, blush, erune, looking_at_viewer, official_alternate_costume, solo, looking_back, butt_crack, medium_breasts, sideboob, water, bikini, from_behind, twin_braids, thighs, smile, white_background, black_one-piece_swimsuit, simple_background, wet |
| 4 | 9 |  |  |  |  |  | 1girl, erune, looking_at_viewer, solo, black_gloves, black_dress, hair_flower, blue_rose, long_sleeves, petals, puffy_sleeves, simple_background, smile, white_background |
| 5 | 8 |  |  |  |  |  | 1girl, bare_shoulders, blue_dress, blue_flower, detached_sleeves, erune, hair_flower, solo, blush, looking_at_viewer, sleeveless_dress, very_long_hair, belt, collarbone, medium_breasts, crying_with_eyes_open, puffy_short_sleeves, white_background, bridal_gauntlets, choker, closed_mouth, hand_up, heart, smile, upper_body |
| 6 | 11 |  |  |  |  |  | 1girl, blush, erune, large_breasts, jewelry, nipples, 1boy, hetero, solo_focus, pussy, smile, bar_censor, looking_at_viewer, open_mouth, penis, breasts_out, sweat, completely_nude, female_pubic_hair, on_back |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | looking_at_viewer | piercing | solo | jewelry | long_sleeves | black_skirt | simple_background | white_background | bags_under_eyes | parted_lips | upper_body | bare_shoulders | braid | large_breasts | black_one-piece_swimsuit | blush | cleavage | official_alternate_costume | collarbone | covered_navel | closed_mouth | sitting | thighs | ass | looking_back | butt_crack | medium_breasts | sideboob | water | bikini | from_behind | twin_braids | smile | wet | black_gloves | black_dress | hair_flower | blue_rose | petals | puffy_sleeves | blue_dress | blue_flower | detached_sleeves | sleeveless_dress | very_long_hair | belt | crying_with_eyes_open | puffy_short_sleeves | bridal_gauntlets | choker | hand_up | heart | nipples | 1boy | hetero | solo_focus | pussy | bar_censor | open_mouth | penis | breasts_out | sweat | completely_nude | female_pubic_hair | on_back |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-----------|:-------|:----------|:---------------|:--------------|:--------------------|:-------------------|:------------------|:--------------|:-------------|:-----------------|:--------|:----------------|:---------------------------|:--------|:-----------|:-----------------------------|:-------------|:----------------|:---------------|:----------|:---------|:------|:---------------|:-------------|:-----------------|:-----------|:--------|:---------|:--------------|:--------------|:--------|:------|:---------------|:--------------|:--------------|:------------|:---------|:----------------|:-------------|:--------------|:-------------------|:-------------------|:-----------------|:-------|:------------------------|:----------------------|:-------------------|:---------|:----------|:--------|:----------|:-------|:---------|:-------------|:--------|:-------------|:-------------|:--------|:--------------|:--------|:------------------|:--------------------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | | X | | | | X | X | | | | X | | | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | X | | X | | | | | X | | | X | X | | | | X | | | X | | X | | | | | | X | | | | | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | X | X | | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nia_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:30:15+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T19:46:07+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of nia/니아 (Granblue Fantasy)
====================================
This is the dataset of nia/니아 (Granblue Fantasy), containing 333 images and their tags.
The core tags of this character are 'long\_hair, animal\_ears, black\_hair, red\_eyes, bangs, breasts, hair\_between\_eyes, earrings, ear\_piercing', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
900d1f0266f7870bc4e692032698720638baf92a |
# Dataset of zooey (Granblue Fantasy)
This is the dataset of zooey (Granblue Fantasy), containing 500 images and their tags.
The core tags of this character are `dark_skin, long_hair, dark-skinned_female, white_hair, red_eyes, hair_between_eyes, ahoge, breasts, very_long_hair, medium_breasts, hair_ornament, hair_flower, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 670.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 404.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1196 | 844.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 603.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1196 | 1.12 GiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zooey_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
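The IMG+TXT packages listed above can also be used without waifuc. A minimal sketch, assuming the usual convention that each image is paired with a same-named `.txt` file holding its comma-separated tags (inspect the extracted archive to confirm):

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/zooey_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)
img_txt_dir = 'zooey_800'
os.makedirs(img_txt_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_txt_dir)

# print the tag line stored next to each image (assumed .txt convention)
for name in sorted(os.listdir(img_txt_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(img_txt_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```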
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, collarbone, official_alternate_costume, solo, white_bikini, bare_shoulders, cleavage, looking_at_viewer, front-tie_bikini_top, hibiscus, blush, navel, simple_background, open_mouth, white_background, upper_body, :d, dragon |
| 1 | 14 |  |  |  |  |  | 1girl, armored_dress, blue_dress, solo, bare_shoulders, looking_at_viewer, breastplate, smile, sword, thighhighs, black_gloves, simple_background, white_background, blush, dragon, open_mouth, shield |
| 2 | 21 |  |  |  |  |  | 1girl, armored_dress, solo, breastplate, holding_sword, bare_shoulders, blue_dress, thighhighs, looking_at_viewer, shield, boots, black_gloves, simple_background, dragon, short_dress, white_background, full_body |
| 3 | 9 |  |  |  |  |  | 1boy, 1girl, blush, female_pubic_hair, hetero, nipples, penis, pussy, sex, solo_focus, vaginal, large_breasts, open_mouth, spread_legs, sweat, bar_censor, navel, clitoris, completely_nude, smile, looking_at_viewer, lying |
| 4 | 7 |  |  |  |  |  | 1girl, ass, looking_at_viewer, solo, anus, blush, nipples, smile, spread_legs, bar_censor, open_mouth, sweat, completely_nude, light_areolae, mosaic_censoring, pussy_juice, shiny_skin, spread_pussy |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collarbone | official_alternate_costume | solo | white_bikini | bare_shoulders | cleavage | looking_at_viewer | front-tie_bikini_top | hibiscus | blush | navel | simple_background | open_mouth | white_background | upper_body | :d | dragon | armored_dress | blue_dress | breastplate | smile | sword | thighhighs | black_gloves | shield | holding_sword | boots | short_dress | full_body | 1boy | female_pubic_hair | hetero | nipples | penis | pussy | sex | solo_focus | vaginal | large_breasts | spread_legs | sweat | bar_censor | clitoris | completely_nude | lying | ass | anus | light_areolae | mosaic_censoring | pussy_juice | shiny_skin | spread_pussy |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-----------------------------|:-------|:---------------|:-----------------|:-----------|:--------------------|:-----------------------|:-----------|:--------|:--------|:--------------------|:-------------|:-------------------|:-------------|:-----|:---------|:----------------|:-------------|:--------------|:--------|:--------|:-------------|:---------------|:---------|:----------------|:--------|:--------------|:------------|:-------|:--------------------|:---------|:----------|:--------|:--------|:------|:-------------|:----------|:----------------|:--------------|:--------|:-------------|:-----------|:------------------|:--------|:------|:-------|:----------------|:-------------------|:--------------|:-------------|:---------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | | X | | X | | X | | | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | | | X | | X | | X | | | | | X | | X | | | X | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | | | | | | X | | | X | X | | X | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | | X | | | X | | | X | | | | | | | | X | | | | | | | | | | | | X | | | | | | | X | X | X | | X | | X | X | X | X | X | X | X |
| CyberHarem/zooey_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:30:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:17:33+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of zooey (Granblue Fantasy)
===================================
This is the dataset of zooey (Granblue Fantasy), containing 500 images and their tags.
The core tags of this character are 'dark\_skin, long\_hair, dark-skinned\_female, white\_hair, red\_eyes, hair\_between\_eyes, ahoge, breasts, very\_long\_hair, medium\_breasts, hair\_ornament, hair\_flower, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
634d566de491d6b99c3ab406629df555660eb512 |
# Dataset of vajra (Granblue Fantasy)
This is the dataset of vajra (Granblue Fantasy), containing 435 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, short_hair, dog_ears, bangs, braid, hair_ornament, breasts, tail, dog_tail, dog_girl, small_breasts, yellow_eyes, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 435 | 715.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vajra_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 435 | 382.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vajra_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1110 | 852.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vajra_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 435 | 620.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vajra_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1110 | 1.23 GiB | [Download](https://huggingface.co/datasets/CyberHarem/vajra_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/vajra_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
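The IMG+TXT packages listed above can also be used without waifuc. A minimal sketch, assuming the usual convention that each image is paired with a same-named `.txt` file holding its comma-separated tags (inspect the extracted archive to confirm):

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/vajra_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)
img_txt_dir = 'vajra_800'
os.makedirs(img_txt_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_txt_dir)

# print the tag line stored next to each image (assumed .txt convention)
for name in sorted(os.listdir(img_txt_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(img_txt_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```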
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, detached_sleeves, erune, katana, looking_at_viewer, shimenawa, solo, bare_shoulders, holding_sword, wide_sleeves, sheath, smile, open_mouth, simple_background, white_background, black_pantyhose, blush, pelvic_curtain, japanese_clothes |
| 1 | 7 |  |  |  |  |  | 1girl, detached_sleeves, erune, looking_at_viewer, open_mouth, shimenawa, smile, solo, bare_shoulders, katana, pantyhose, wide_sleeves, blush, japanese_clothes, sheath |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, detached_sleeves, erune, fang, looking_at_viewer, open_mouth, shimenawa, solo, wide_sleeves, :d, black_pantyhose, pelvic_curtain, white_background, claw_pose, japanese_clothes, simple_background |
| 3 | 14 |  |  |  |  |  | 1girl, detached_sleeves, erune, looking_at_viewer, shimenawa, solo, wide_sleeves, bare_shoulders, black_pantyhose, blush, open_mouth, simple_background, white_background, japanese_clothes, pelvic_curtain, :d |
| 4 | 6 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, erune, hair_flower, kimono, looking_at_viewer, smile, solo, blush, nail_polish, obi, red_nails, wide_sleeves, open_mouth, simple_background, white_background, claw_pose, petals, upper_body |
| 5 | 8 |  |  |  |  |  | 1girl, bare_shoulders, blush, erune, kimono, looking_at_viewer, obi, solo, wide_sleeves, detached_sleeves, hair_flower, long_sleeves, open_mouth, :d, oil-paper_umbrella |
| 6 | 7 |  |  |  |  |  | 1girl, erune, hair_flower, looking_at_viewer, open_mouth, solo, white_flower, blush, red_neckerchief, upper_body, collarbone, shirt, short_sleeves, white_sailor_collar, ;d, one_eye_closed, simple_background, white_background, :d, claw_pose, thick_eyebrows |
| 7 | 5 |  |  |  |  |  | 1girl, blush, erune, hair_flower, hat, looking_at_viewer, navel, official_alternate_costume, sailor_collar, sleeveless, solo, white_headwear, blue_shorts, short_shorts, simple_background, white_background, :d, bare_shoulders, midriff, open_mouth, white_belt, wrist_scrunchie, thick_eyebrows, white_jacket |
| 8 | 9 |  |  |  |  |  | 1girl, blue_shorts, hair_flower, hat, looking_at_viewer, midriff, navel, official_alternate_costume, open_mouth, sailor_collar, smile, solo, white_headwear, belt, blush, erune, wrist_scrunchie, blue_sky, day, outdoors, water, cloud, short_shorts, sleeveless_jacket, crop_top, one_eye_closed, wading |
| 9 | 6 |  |  |  |  |  | 1girl, blush, erune, hetero, looking_at_viewer, open_mouth, penis, pussy, solo_focus, 1boy, pantyhose, spread_legs, torn_clothes, nipples, sex, sweat, tongue_out, vaginal, bar_censor, on_back, pubic_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | erune | katana | looking_at_viewer | shimenawa | solo | bare_shoulders | holding_sword | wide_sleeves | sheath | smile | open_mouth | simple_background | white_background | black_pantyhose | blush | pelvic_curtain | japanese_clothes | pantyhose | fang | :d | claw_pose | hair_flower | kimono | nail_polish | obi | red_nails | petals | upper_body | long_sleeves | oil-paper_umbrella | white_flower | red_neckerchief | collarbone | shirt | short_sleeves | white_sailor_collar | ;d | one_eye_closed | thick_eyebrows | hat | navel | official_alternate_costume | sailor_collar | sleeveless | white_headwear | blue_shorts | short_shorts | midriff | white_belt | wrist_scrunchie | white_jacket | belt | blue_sky | day | outdoors | water | cloud | sleeveless_jacket | crop_top | wading | hetero | penis | pussy | solo_focus | 1boy | spread_legs | torn_clothes | nipples | sex | sweat | tongue_out | vaginal | bar_censor | on_back | pubic_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------|:---------|:--------------------|:------------|:-------|:-----------------|:----------------|:---------------|:---------|:--------|:-------------|:--------------------|:-------------------|:------------------|:--------|:-----------------|:-------------------|:------------|:-------|:-----|:------------|:--------------|:---------|:--------------|:------|:------------|:---------|:-------------|:---------------|:---------------------|:---------------|:------------------|:-------------|:--------|:----------------|:----------------------|:-----|:-----------------|:-----------------|:------|:--------|:-----------------------------|:----------------|:-------------|:-----------------|:--------------|:---------------|:----------|:-------------|:------------------|:---------------|:-------|:-----------|:------|:-----------|:--------|:--------|:--------------------|:-----------|:---------|:---------|:--------|:--------|:-------------|:-------|:--------------|:---------------|:----------|:------|:--------|:-------------|:----------|:-------------|:----------|:-------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | | | X | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | | | X | X | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | | X | | X | X | | X | | X | X | X | X | | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | X | | X | | X | X | | X | | | X | | | | X | | | | | X | | X | X | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | X | | X | | X | | | | | | X | X | X | | X | | | | | X | X | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | | X | | X | X | | | | | X | X | X | | X | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | X | | X | | X | | | | | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | | X | X | X | X | | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | X | | X | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/vajra_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:30:38+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:19:14+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of vajra (Granblue Fantasy)
===================================
This is the dataset of vajra (Granblue Fantasy), containing 435 images and their tags.
The core tags of this character are 'animal\_ears, blonde\_hair, short\_hair, dog\_ears, bangs, braid, hair\_ornament, breasts, tail, dog\_tail, dog\_girl, small\_breasts, yellow\_eyes, brown\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
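The sketch below mirrors the waifuc loading pattern used by the other character datasets in this collection; only the repo id (`CyberHarem/vajra_granbluefantasy`, taken from this dataset's own listing) is specific to this dataset.

```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file from the Hugging Face Hub
zip_file = hf_hub_download(
    repo_id='CyberHarem/vajra_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to a local directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the extracted images and their tags with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```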
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
dff5c81d34b6cead7c37224ce9ac1badb984b29b |
# DogLLaMA-small
The dataset was created using the GPT-3.5 Turbo model by OpenAI. The process involved providing a system prompt instructing the model to respond as if it were a dog, conveying translations or responses in a playful and enthusiastic manner. The dataset includes a variety of prompts covering different topics to ensure diversity in the training data.
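As a rough illustration of that process (the actual prompts and generation script live in the repository linked below; the system prompt here is a hypothetical stand-in), the generation loop could look like this:

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY to be set in the environment

# Hypothetical system prompt -- the exact wording used for this dataset
# is defined in the linked repository, not reproduced here.
SYSTEM_PROMPT = (
    "You are a dog. Answer every question the way an excited, playful dog "
    "would, with lots of enthusiasm (and the occasional 'woof')."
)

def dog_response(user_prompt: str) -> str:
    """Ask GPT-3.5 Turbo for one dog-style reply to the given prompt."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_prompt},
        ],
    )
    return response.choices[0].message.content

print(dog_response("Can you explain what photosynthesis is?"))
```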
The code used to generate the dataset can be found here: https://github.com/pathikg/DogLLaMA
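According to the dataset configuration, the data ships as a single `text` field split into `train` (2,864 examples) and `test` (319 examples), so it can be loaded directly with the `datasets` library:

```python
from datasets import load_dataset

# Load both splits from the Hugging Face Hub
dataset = load_dataset("pathikg/DogLLAMA-small")

print(dataset)                      # DatasetDict with "train" and "test" splits
print(dataset["train"][0]["text"])  # each example carries a single "text" field
```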
| pathikg/DogLLAMA-small | [
"task_categories:question-answering",
"size_categories:1K < n < 10K",
"region:us"
] | 2024-01-21T18:42:02+00:00 | {"size_categories": ["1K < n < 10K"], "task_categories": ["question-answering"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 367171, "num_examples": 2864}, {"name": "test", "num_bytes": 41198, "num_examples": 319}], "download_size": 135884, "dataset_size": 408369}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-22T19:44:18+00:00 | [] | [] | TAGS
#task_categories-question-answering #size_categories-1K < n < 10K #region-us
|
# DogLLaMA-small
The dataset was created using the GPT-3.5 Turbo model by OpenAI. The process involved providing a system prompt instructing the model to respond as if it were a dog, conveying translations or responses in a playful and enthusiastic manner. The dataset includes a variety of prompts covering different topics to ensure diversity in the training data.
The code for the same can be found here: URL
| [
"# DogLLaMA-small\n\nThe dataset was created using the GPT-3.5 Turbo model by OpenAI. The process involved providing a system prompt instructing the model to respond as if it were a dog, conveying translations or responses in a playful and enthusiastic manner. The dataset includes a variety of prompts covering different topics to ensure diversity in the training data.\n\nThe code for the same can be found here: URL"
] | [
"TAGS\n#task_categories-question-answering #size_categories-1K < n < 10K #region-us \n",
"# DogLLaMA-small\n\nThe dataset was created using the GPT-3.5 Turbo model by OpenAI. The process involved providing a system prompt instructing the model to respond as if it were a dog, conveying translations or responses in a playful and enthusiastic manner. The dataset includes a variety of prompts covering different topics to ensure diversity in the training data.\n\nThe code for the same can be found here: URL"
] |
4d9c3cd198e46ab824af40091c279200997024bc |
# Dataset of charlotta (Granblue Fantasy)
This is the dataset of charlotta (Granblue Fantasy), containing 302 images and their tags.
The core tags of this character are `blonde_hair, long_hair, pointy_ears, blue_eyes, crown, very_long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 302 | 251.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotta_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 302 | 179.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotta_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 594 | 335.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotta_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 302 | 233.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotta_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 594 | 408.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotta_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
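If you only need one of the pre-scaled IMG+TXT archives listed above rather than the waifuc-oriented raw package, it can be fetched the same way with `hf_hub_download`. The sketch below assumes the conventional IMG+TXT layout (each image accompanied by a same-named `.txt` tag file); the exact internal layout of the archives is not documented here.

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/charlotta_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it next to your training data
dataset_dir = 'charlotta_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# walk the extracted files and pair every image with its tag file, if present
for root, _, files in os.walk(dataset_dir):
    for name in files:
        stem, ext = os.path.splitext(name)
        if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
            tag_file = os.path.join(root, stem + '.txt')
            tags = None
            if os.path.exists(tag_file):
                with open(tag_file, 'r', encoding='utf-8') as f:
                    tags = f.read().strip()
            print(os.path.join(root, name), '->', tags)
```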
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/charlotta_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blue_dress, breastplate, gauntlets, harvin, holding_sword, solo, simple_background, armored_boots, frilled_dress, v-shaped_eyebrows, white_background, puffy_short_sleeves, open_mouth, armored_dress, full_body |
| 1 | 5 |  |  |  |  |  | 1girl, blue_dress, closed_mouth, gauntlets, harvin, holding_sword, looking_at_viewer, solo, simple_background, smile, white_background, puffy_sleeves, blush, breastplate, hair_between_eyes, v-shaped_eyebrows |
| 2 | 6 |  |  |  |  |  | 1girl, blue_dress, blush, breastplate, harvin, looking_at_viewer, puffy_short_sleeves, solo, simple_background, white_background, gauntlets, v-shaped_eyebrows, open_mouth, smile, upper_body |
| 3 | 7 |  |  |  |  |  | 1girl, armor, dress, harvin, solo, sword, gauntlets, looking_at_viewer, smile, open_mouth |
| 4 | 6 |  |  |  |  |  | 1girl, bat_wings, blush, hair_bow, halloween, harvin, jack-o'-lantern, pumpkin, solo, black_bow, puffy_short_sleeves, armored_boots, breastplate, gauntlets, looking_at_viewer, orange_dress, sword, closed_mouth, frilled_dress, full_body, holding, smile, v-shaped_eyebrows |
| 5 | 12 |  |  |  |  |  | 1girl, harvin, solo, bare_shoulders, blush, collarbone, looking_at_viewer, navel, smile, white_background, hair_between_eyes, closed_mouth, official_alternate_costume, simple_background, full_body, >:), bikini_skirt, standing, star_(symbol), white_bikini |
| 6 | 5 |  |  |  |  |  | 1girl, fake_animal_ears, harvin, leotard, playboy_bunny, rabbit_ears, solo, bare_shoulders, detached_collar, looking_at_viewer, wrist_cuffs, black_pantyhose, blush, bowtie, full_body, rabbit_tail, simple_background, small_breasts, cowboy_shot, grey_background, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_dress | breastplate | gauntlets | harvin | holding_sword | solo | simple_background | armored_boots | frilled_dress | v-shaped_eyebrows | white_background | puffy_short_sleeves | open_mouth | armored_dress | full_body | closed_mouth | looking_at_viewer | smile | puffy_sleeves | blush | hair_between_eyes | upper_body | armor | dress | sword | bat_wings | hair_bow | halloween | jack-o'-lantern | pumpkin | black_bow | orange_dress | holding | bare_shoulders | collarbone | navel | official_alternate_costume | >:) | bikini_skirt | standing | star_(symbol) | white_bikini | fake_animal_ears | leotard | playboy_bunny | rabbit_ears | detached_collar | wrist_cuffs | black_pantyhose | bowtie | rabbit_tail | small_breasts | cowboy_shot | grey_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------|:------------|:---------|:----------------|:-------|:--------------------|:----------------|:----------------|:--------------------|:-------------------|:----------------------|:-------------|:----------------|:------------|:---------------|:--------------------|:--------|:----------------|:--------|:--------------------|:-------------|:--------|:--------|:--------|:------------|:-----------|:------------|:------------------|:----------|:------------|:---------------|:----------|:-----------------|:-------------|:--------|:-----------------------------|:------|:---------------|:-----------|:----------------|:---------------|:-------------------|:----------|:----------------|:--------------|:------------------|:--------------|:------------------|:---------|:--------------|:----------------|:--------------|:------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | X | X | X | | | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | X | X | | X | | | | | | | X | | | | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | X | X | | X | | X | X | X | | X | | | X | X | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | | | X | | X | X | | | | X | | | | X | X | X | X | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | X | | X | X | | | | | | X | | X | | X | | | X | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/charlotta_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:46:52+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T19:36:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of charlotta (Granblue Fantasy)
=======================================
This is the dataset of charlotta (Granblue Fantasy), containing 302 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, pointy\_ears, blue\_eyes, crown, very\_long\_hair, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the loading code shown above.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b45ef8e3617f46162783ffa8f4a7fe8c8a6d8059 |
# Dataset of ilsa (Granblue Fantasy)
This is the dataset of ilsa (Granblue Fantasy), containing 213 images and their tags.
The core tags of this character are `animal_ears, black_hair, breasts, red_eyes, hair_between_eyes, large_breasts, long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 213 | 296.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ilsa_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 213 | 171.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ilsa_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 507 | 357.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ilsa_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 213 | 260.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ilsa_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 507 | 510.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ilsa_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ilsa_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, cape, erune, hat, looking_at_viewer, solo, black_gloves, covered_navel, military_uniform, sideboob, simple_background, white_background, blush, long_sleeves, upper_body, closed_mouth, double-breasted, white_headwear |
| 1 | 8 |  |  |  |  |  | 1girl, erune, solo, black_gloves, cape, hat, looking_at_viewer, open_mouth, sideboob, covered_navel, long_sleeves, military_uniform, short_hair, gun, holding, white_pants |
| 2 | 54 |  |  |  |  |  | erune, 1girl, solo, looking_at_viewer, bare_shoulders, cleavage, official_alternate_costume, blush, sunglasses, eyewear_on_head, hair_ribbon, smile, hair_flower, white_one-piece_swimsuit, collarbone |
| 3 | 14 |  |  |  |  |  | 1girl, erune, solo, belt, cleavage, looking_at_viewer, torn_jeans, cape, ear_piercing, single_pantsleg, long_sleeves, high-waist_pants, simple_background, weapon, white_background, black_shirt, clothing_cutout |
| 4 | 5 |  |  |  |  |  | 1girl, erune, looking_at_viewer, solo, blush, nipples, onsen, water, sitting, towel, wet, collarbone, completely_nude, full_moon, jewelry, lips, navel, night, nude_cover, open_mouth, smile, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | erune | hat | looking_at_viewer | solo | black_gloves | covered_navel | military_uniform | sideboob | simple_background | white_background | blush | long_sleeves | upper_body | closed_mouth | double-breasted | white_headwear | open_mouth | short_hair | gun | holding | white_pants | bare_shoulders | cleavage | official_alternate_costume | sunglasses | eyewear_on_head | hair_ribbon | smile | hair_flower | white_one-piece_swimsuit | collarbone | belt | torn_jeans | ear_piercing | single_pantsleg | high-waist_pants | weapon | black_shirt | clothing_cutout | nipples | onsen | water | sitting | towel | wet | completely_nude | full_moon | jewelry | lips | navel | night | nude_cover | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:------|:--------------------|:-------|:---------------|:----------------|:-------------------|:-----------|:--------------------|:-------------------|:--------|:---------------|:-------------|:---------------|:------------------|:-----------------|:-------------|:-------------|:------|:----------|:--------------|:-----------------|:-----------|:-----------------------------|:-------------|:------------------|:--------------|:--------|:--------------|:---------------------------|:-------------|:-------|:-------------|:---------------|:------------------|:-------------------|:---------|:--------------|:------------------|:----------|:--------|:--------|:----------|:--------|:------|:------------------|:------------|:----------|:-------|:--------|:--------|:-------------|:---------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 54 |  |  |  |  |  | X | | X | | X | X | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | | X | X | | | | | X | X | | X | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | X | X | | | | | | | X | | | | | | X | | | | | | | | | | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ilsa_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:46:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T19:35:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ilsa (Granblue Fantasy)
==================================
This is the dataset of ilsa (Granblue Fantasy), containing 213 images and their tags.
The core tags of this character are 'animal\_ears, black\_hair, breasts, red\_eyes, hair\_between\_eyes, large\_breasts, long\_hair, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the loading code shown above.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
be841a835ec2ae016436a8311dedcc1721340e4d |
# Dataset of kubira (Granblue Fantasy)
This is the dataset of kubira (Granblue Fantasy), containing 306 images and their tags.
The core tags of this character are `dark_skin, blonde_hair, horns, long_hair, dark-skinned_female, pointy_ears, breasts, large_breasts, bangs, yellow_eyes, horn_ornament, multicolored_hair, pink_hair, brown_eyes, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 306 | 456.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 306 | 260.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 762 | 562.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 306 | 401.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 762 | 804.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kubira_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1boy, 1girl, blush, draph, hetero, looking_at_viewer, nipples, solo_focus, paizuri, smile, breasts_squeezed_together, open_mouth, penis, collarbone, cum_on_breasts, two-tone_hair, censored, hair_flower, horn_ribbon, jewelry |
| 1 | 44 |  |  |  |  |  | 1girl, draph, looking_at_viewer, solo, smile, black_bikini, official_alternate_costume, blush, cleavage, bare_shoulders, hair_flower, horn_ribbon, layered_bikini, navel, two-tone_hair, parted_bangs |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, belt, black_shorts, cleavage, draph, fur_trim, looking_at_viewer, midriff, navel, off_shoulder, short_shorts, smile, solo, wide_sleeves, blush, collarbone, elbow_gloves, long_sleeves, necklace, open_mouth, thighs, white_gloves, white_thighhighs, gourd, jacket, parted_bangs, sidelocks, simple_background, white_background, cowboy_shot, swept_bangs |
| 3 | 5 |  |  |  |  |  | 1girl, draph, smile, solo, blush, coat, long_sleeves, ribbon, sweater, upper_body, boar, jewelry, looking_at_viewer, red_scarf, snow |
| 4 | 6 |  |  |  |  |  | 1boy, 1girl, draph, hetero, navel, nipples, sex, solo_focus, sweat, parted_bangs, penis, pussy, spread_legs, vaginal, bar_censor, bed_sheet, colored_inner_hair, completely_nude, female_pubic_hair, missionary, on_back, open_mouth, two-tone_hair, cum, horn_ribbon, nose_blush, on_bed |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | blush | draph | hetero | looking_at_viewer | nipples | solo_focus | paizuri | smile | breasts_squeezed_together | open_mouth | penis | collarbone | cum_on_breasts | two-tone_hair | censored | hair_flower | horn_ribbon | jewelry | solo | black_bikini | official_alternate_costume | cleavage | bare_shoulders | layered_bikini | navel | parted_bangs | belt | black_shorts | fur_trim | midriff | off_shoulder | short_shorts | wide_sleeves | elbow_gloves | long_sleeves | necklace | thighs | white_gloves | white_thighhighs | gourd | jacket | sidelocks | simple_background | white_background | cowboy_shot | swept_bangs | coat | ribbon | sweater | upper_body | boar | red_scarf | snow | sex | sweat | pussy | spread_legs | vaginal | bar_censor | bed_sheet | colored_inner_hair | completely_nude | female_pubic_hair | missionary | on_back | cum | nose_blush | on_bed |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:--------|:--------|:---------|:--------------------|:----------|:-------------|:----------|:--------|:----------------------------|:-------------|:--------|:-------------|:-----------------|:----------------|:-----------|:--------------|:--------------|:----------|:-------|:---------------|:-----------------------------|:-----------|:-----------------|:-----------------|:--------|:---------------|:-------|:---------------|:-----------|:----------|:---------------|:---------------|:---------------|:---------------|:---------------|:-----------|:---------|:---------------|:-------------------|:--------|:---------|:------------|:--------------------|:-------------------|:--------------|:--------------|:-------|:---------|:----------|:-------------|:-------|:------------|:-------|:------|:--------|:--------|:--------------|:----------|:-------------|:------------|:---------------------|:------------------|:--------------------|:-------------|:----------|:------|:-------------|:---------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 44 |  |  |  |  |  | | X | X | X | | X | | | | X | | | | | | X | | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | | X | X | X | | X | | | | X | | X | | X | | | | | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | X | X | X | | X | | | | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | X | | X | X | | | | X | X | | | X | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kubira_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T18:47:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T19:57:33+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kubira (Granblue Fantasy)
====================================
This is the dataset of kubira (Granblue Fantasy), containing 306 images and their tags.
The core tags of this character are 'dark\_skin, blonde\_hair, horns, long\_hair, dark-skinned\_female, pointy\_ears, breasts, large\_breasts, bangs, yellow\_eyes, horn\_ornament, multicolored\_hair, pink\_hair, brown\_eyes, earrings', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the loading code shown above.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
07d1a60e3343a1f0175ee64aef6a6d94bed450e9 | Synthetic Appeals Data
*INITIAL GENERATION IS BROKEN*
Synthetically generated health insurance appeal-related data, plus health-insurance-related summarization-style data.
No warranty is expressed or implied. Generated from https://github.com/totallylegitco/healthinsurance-llm
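A minimal loading sketch (assuming the repository's data files are in a format the `datasets` library can resolve automatically; split and column names are not documented here and may differ):

```python
from datasets import load_dataset

# Attempt to load whatever data files the repository exposes
dataset = load_dataset("TotallyLegitCo/synthetic-appeals")

print(dataset)  # inspect the available splits and columns before relying on them
```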
---
language:
- en
tags:
- medical
size_categories:
- 10K<n<100K
---
| TotallyLegitCo/synthetic-appeals | [
"region:us"
] | 2024-01-21T18:57:12+00:00 | {} | 2024-01-21T20:02:32+00:00 | [] | [] | TAGS
#region-us
| Synthetic Appeals Data
*INITIAL GENERATION IS BROKEN*
Synthetically generated health insurance appeal-related data + health insurance-related summarization type data.
No warranty is expressed or implied. Generated from URL
---
language:
- en
tags:
- medical
size_categories:
- 10K<n<100K
---
| [] | [
"TAGS\n#region-us \n"
] |
14546166be44ed677b4b505579962e5cd13ff40d |
# Dataset of mirin (Granblue Fantasy)
This is the dataset of mirin (Granblue Fantasy), containing 73 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, bangs, bow, breasts, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 73 | 85.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 73 | 53.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 166 | 112.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 73 | 77.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 166 | 154.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mirin_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 73 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, blush, kimono, holding, open_mouth, katana, wide_sleeves, skirt, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | blush | kimono | holding | open_mouth | katana | wide_sleeves | skirt | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:---------|:----------|:-------------|:---------|:---------------|:--------|:-------------------|
| 0 | 73 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/mirin_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T19:03:03+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T19:20:27+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of mirin (Granblue Fantasy)
===================================
This is the dataset of mirin (Granblue Fantasy), containing 73 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, blue\_eyes, bangs, bow, breasts, ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the loading code shown above.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
99eff5a1da9b8183dd75413a6c5e80f7c1b67683 |
# Dataset of europa (Granblue Fantasy)
This is the dataset of europa (Granblue Fantasy), containing 297 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, breasts, short_hair, hair_ornament, hair_flower, bangs, hair_between_eyes, large_breasts, ribbon, blue_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 297 | 456.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/europa_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 297 | 260.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/europa_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 727 | 560.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/europa_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 297 | 406.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/europa_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 727 | 786.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/europa_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/europa_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
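One simple way to start mining such outfit clusters yourself is to count how often tags occur across the extracted dataset. The sketch below relies only on the `item.meta['tags']` field used in the loading code above and assumes it can be iterated as a collection of tag names (its exact type is not documented here).

```python
from collections import Counter

from waifuc.source import LocalSource

# point this at the directory extracted by the loading snippet above
source = LocalSource('dataset_dir')

tag_counts = Counter()
for item in source:
    # list(...) yields tag names whether tags is a list or a dict keyed by tag
    tag_counts.update(list(item.meta['tags']))

# frequent non-core tags are good candidates for outfit-specific clusters
for tag, count in tag_counts.most_common(30):
    print(f'{tag}: {count}')
```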
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, collarbone, flower, looking_at_viewer, official_alternate_costume, solo, tiara, white_background, white_bikini, navel, simple_background, thighs, smile, closed_mouth, open_mouth |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blue_sky, cleavage, cloud, collarbone, day, flower, looking_at_viewer, official_alternate_costume, outdoors, solo, tiara, white_bikini, ocean, open_mouth, :d, navel, water, blush |
| 2 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, flower, looking_at_viewer, medium_breasts, simple_background, solo, tiara, white_background, white_dress, blue_cape, boots, full_body, short_dress, white_footwear, smile, strapless_dress |
| 3 | 9 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, flower, looking_at_viewer, smile, solo, tiara, white_dress, blue_cape, simple_background, blush, thighs, short_dress, white_background, closed_mouth |
| 4 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, closed_mouth, collarbone, looking_at_viewer, short_dress, sitting, smile, solo, strapless_dress, tiara, white_dress, rose, blush, medium_breasts, thighs, blue_cape, boots, orange_flower, petals, white_background |
| 5 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, penis, pussy, sex, sweat, vaginal, flower, looking_at_viewer, spread_legs, girl_on_top, mosaic_censoring, open_mouth, solo_focus, tiara, collarbone, cowgirl_position, navel, smile, bare_shoulders, nude |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | cleavage | collarbone | flower | looking_at_viewer | official_alternate_costume | solo | tiara | white_background | white_bikini | navel | simple_background | thighs | smile | closed_mouth | open_mouth | blue_sky | cloud | day | outdoors | ocean | :d | water | medium_breasts | white_dress | blue_cape | boots | full_body | short_dress | white_footwear | strapless_dress | sitting | rose | orange_flower | petals | 1boy | hetero | nipples | penis | pussy | sex | sweat | vaginal | spread_legs | girl_on_top | mosaic_censoring | solo_focus | cowgirl_position | nude |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-----------|:-------------|:---------|:--------------------|:-----------------------------|:-------|:--------|:-------------------|:---------------|:--------|:--------------------|:---------|:--------|:---------------|:-------------|:-----------|:--------|:------|:-----------|:--------|:-----|:--------|:-----------------|:--------------|:------------|:--------|:------------|:--------------|:-----------------|:------------------|:----------|:-------|:----------------|:---------|:-------|:---------|:----------|:--------|:--------|:------|:--------|:----------|:--------------|:--------------|:-------------------|:-------------|:-------------------|:-------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | X | X | X | X | | X | X | X | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | | | X | X | X | X | | | | | | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | X | | | | X | X | X | | | | | | | | | X | X | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | X | X | | X | X | X | | | X | | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/europa_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T19:03:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:10:38+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of europa (Granblue Fantasy)
====================================
This is the dataset of europa (Granblue Fantasy), containing 297 images and their tags.
The core tags of this character are 'blonde\_hair, blue\_eyes, breasts, short\_hair, hair\_ornament, hair\_flower, bangs, hair\_between\_eyes, large\_breasts, ribbon, blue\_ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the loading code shown above.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
cd4f67e1b151d9fa190b5b3285980ef7a76267a8 |
# Dataset of fraux (Granblue Fantasy)
This is the dataset of fraux (Granblue Fantasy), containing 117 images and their tags.
The core tags of this character are `animal_ears, long_hair, braid, breasts, twintails, bangs, braided_bangs, medium_breasts, swept_bangs, grey_hair, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 117 | 176.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fraux_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 117 | 102.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fraux_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 300 | 224.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fraux_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 117 | 162.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fraux_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 300 | 315.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fraux_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fraux_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, collarbone, erune, looking_at_viewer, smile, solo, cleavage, blush, closed_mouth, upper_body, simple_background, white_background, bare_shoulders, choker |
| 1 | 10 |  |  |  |  |  | 1girl, erune, looking_at_viewer, solo, cleavage, smile, black_thighhighs, blush, open_robe, black_leotard, closed_mouth, collarbone, bare_shoulders, long_sleeves, simple_background, thighs, hooded_robe, off_shoulder, open_coat, white_background, wide_sleeves, choker |
| 2 | 5 |  |  |  |  |  | 1girl, erune, hooded_robe, looking_at_viewer, open_mouth, solo, :d, open_robe, bare_shoulders, black_leotard, blush, cleavage, thighs, ass, black_thighhighs, collarbone, long_sleeves, red_eyes, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collarbone | erune | looking_at_viewer | smile | solo | cleavage | blush | closed_mouth | upper_body | simple_background | white_background | bare_shoulders | choker | black_thighhighs | open_robe | black_leotard | long_sleeves | thighs | hooded_robe | off_shoulder | open_coat | wide_sleeves | open_mouth | :d | ass | red_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------|:--------------------|:--------|:-------|:-----------|:--------|:---------------|:-------------|:--------------------|:-------------------|:-----------------|:---------|:-------------------|:------------|:----------------|:---------------|:---------|:--------------|:---------------|:------------|:---------------|:-------------|:-----|:------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | X | X | X | | | | X | X | | X | X | X | X | X | X | | | | X | X | X | X |
| CyberHarem/fraux_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T19:03:10+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T19:27:16+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fraux (Granblue Fantasy)
===================================
This is the dataset of fraux (Granblue Fantasy), containing 117 images and their tags.
The core tags of this character are 'animal\_ears, long\_hair, braid, breasts, twintails, bangs, braided\_bangs, medium\_breasts, swept\_bangs, grey\_hair, brown\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
67ec68723a0b704c0594164b3b374f46d31395db |
# Dataset of silva (Granblue Fantasy)
This is the dataset of silva (Granblue Fantasy), containing 341 images and their tags.
The core tags of this character are `long_hair, breasts, yellow_eyes, braid, ahoge, large_breasts, twin_braids, hair_between_eyes, very_long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 341 | 384.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 341 | 263.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 754 | 504.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 341 | 359.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 754 | 642.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silva_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/silva_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
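Continuing from the snippet above, a small sketch like the following can show which tags dominate the package, for example when picking prompt or trigger words. It assumes `item.meta['tags']` is either a list of tag strings or a tag-to-score mapping, matching what the loop above prints.
```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag appears in the extracted raw package
# (assumes item.meta['tags'] is a list of tags or a tag -> score mapping)
counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta.get('tags') or []
    counter.update(tags.keys() if isinstance(tags, dict) else tags)

# the most frequent tags are good candidates for prompt or trigger words
for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```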
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | blush, pantyhose, playboy_bunny, rabbit_ears, 1girl, leotard, solo, fake_animal_ears, looking_at_viewer, bare_shoulders, detached_collar, white_background, ass, cleavage, open_mouth, rabbit_tail, simple_background, wrist_cuffs |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, belt, blush, cleavage, collarbone, crop_top, looking_at_viewer, midriff, solo, navel, simple_background, wavy_hair, armpits, smile, white_background, arm_behind_head, arms_up, black_pants, brown_eyes, closed_mouth, open_clothes, ponytail, sleeveless_shirt, upper_body |
| 2 | 15 |  |  |  |  |  | 1girl, cleavage, midriff, solo, belt, navel, looking_at_viewer, crop_top, black_jacket, long_sleeves, open_jacket, collarbone, white_shirt, ponytail, white_background, simple_background, wavy_hair, black_pants, gun, stomach |
| 3 | 47 |  |  |  |  |  | 1girl, midriff, solo, belt, cleavage, miniskirt, navel, rifle, looking_at_viewer, boots, crop_top, thighhighs, holding |
| 4 | 17 |  |  |  |  |  | cleavage, 1girl, collarbone, looking_at_viewer, solo, bare_shoulders, blue_bikini, navel, blush, sarong, smile, sun_hat, sitting |
| 5 | 5 |  |  |  |  |  | blue_bikini, blue_sky, blush, cleavage, day, looking_at_viewer, navel, smile, 1girl, bare_shoulders, beach, collarbone, outdoors, solo, thighs, cloud, side-tie_bikini_bottom, water, grey_hair, micro_bikini, ocean, open_mouth, palm_tree, sitting, standing, wavy_hair |
| 6 | 22 |  |  |  |  |  | 1girl, official_alternate_costume, bare_shoulders, looking_at_viewer, ponytail, solo, choker, cleavage, collarbone, blue_dress, blush, smile, white_background, simple_background, bracelet, wavy_hair, purple_dress, closed_mouth, thighs |
| 7 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, paizuri, solo_focus, nipples, penis, huge_breasts, collarbone, cum, mosaic_censoring, open_mouth, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | pantyhose | playboy_bunny | rabbit_ears | 1girl | leotard | solo | fake_animal_ears | looking_at_viewer | bare_shoulders | detached_collar | white_background | ass | cleavage | open_mouth | rabbit_tail | simple_background | wrist_cuffs | belt | collarbone | crop_top | midriff | navel | wavy_hair | armpits | smile | arm_behind_head | arms_up | black_pants | brown_eyes | closed_mouth | open_clothes | ponytail | sleeveless_shirt | upper_body | black_jacket | long_sleeves | open_jacket | white_shirt | gun | stomach | miniskirt | rifle | boots | thighhighs | holding | blue_bikini | sarong | sun_hat | sitting | blue_sky | day | beach | outdoors | thighs | cloud | side-tie_bikini_bottom | water | grey_hair | micro_bikini | ocean | palm_tree | standing | official_alternate_costume | choker | blue_dress | bracelet | purple_dress | 1boy | hetero | paizuri | solo_focus | nipples | penis | huge_breasts | cum | mosaic_censoring |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:----------------|:--------------|:--------|:----------|:-------|:-------------------|:--------------------|:-----------------|:------------------|:-------------------|:------|:-----------|:-------------|:--------------|:--------------------|:--------------|:-------|:-------------|:-----------|:----------|:--------|:------------|:----------|:--------|:------------------|:----------|:--------------|:-------------|:---------------|:---------------|:-----------|:-------------------|:-------------|:---------------|:---------------|:--------------|:--------------|:------|:----------|:------------|:--------|:--------|:-------------|:----------|:--------------|:---------|:----------|:----------|:-----------|:------|:--------|:-----------|:---------|:--------|:-------------------------|:--------|:------------|:---------------|:--------|:------------|:-----------|:-----------------------------|:---------|:-------------|:-----------|:---------------|:-------|:---------|:----------|:-------------|:----------|:--------|:---------------|:------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | | X | | X | | X | X | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | | | | | X | | X | | X | | | X | | X | | | X | | X | X | X | X | X | X | | | | | X | | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 47 |  |  |  |  |  | | | | | X | | X | | X | | | | | X | | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 17 |  |  |  |  |  | X | | | | X | | X | | X | X | | | | X | | | | | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | X | | X | | X | X | | | | X | X | | | | | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 6 | 22 |  |  |  |  |  | X | | | | X | | X | | X | X | | X | | X | | | X | | | X | | | | X | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | | X | | | | | | | | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/silva_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T19:03:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:05:18+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of silva (Granblue Fantasy)
===================================
This is the dataset of silva (Granblue Fantasy), containing 341 images and their tags.
The core tags of this character are 'long\_hair, breasts, yellow\_eyes, braid, ahoge, large\_breasts, twin\_braids, hair\_between\_eyes, very\_long\_hair, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
9135c3367bbaba756b47c2c870b250bc9d4c9054 |
# Dataset Card for german-court-decisions
60k judicial decisions in Germany retrieved on January 1, 2024.
# Dataset Description
**Language(s) (NLP):** German<br>
**License:** MIT
Copyright notice: Automated retrieval of decisions from federal and state databases in Germany is permitted for non-commercial purposes only. As a result, the use of this dataset is permitted for non-commercial purposes only.
## Uses
Prediction of verdicts based on the statement of facts.
### Direct Use
Classification of outcomes (convicted, dismissed, etc.).
### Out-of-Scope Use
Text generation of verdicts based on the statement of facts.
## Dataset Structure
```
features: ['court', 'state', 'date', 'offense', 'decision', 'costs to defendant', 'convicted', 'costs to plaintiff', 'dismissed']
```
```court```:  court name (string), e.g. 'AG Bad Neuenahr-Ahrweiler'; may be empty if it does not match the search pattern<br>
```state```:  federated state (string), e.g. 'Brandenburg', 'Berlin', ... , 'Thüringen' <br>
```date```:  date (string), e.g. '03.04.2013'; may be empty if it does not match the search pattern<br>
```offense```:  statement of facts (string; _Tatbestand_ in German)<br>
```decision```:  verdict (string; _Tenor_ in German)<br>
```costs to defendant```:  Boolean (0/1), text phrase detection (see data processing below)<br>
```convicted```:  Boolean (0/1), text phrase detection (see data processing below)<br>
```costs to plaintiff```:  Boolean (0/1), text phrase detection (see data processing below)<br>
```dismissed```:  Boolean (0/1), text phrase detection (see data processing below)<br>
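A minimal loading sketch with the `datasets` library is shown below; it assumes the default configuration and a `train` split, so adjust the split name if needed.
```python
from datasets import load_dataset

# load the decisions (assumes the default configuration and a "train" split)
ds = load_dataset("SH108/german-court-decisions", split="train")

# e.g. keep only decisions where the defendant was convicted and bears the costs
convicted = ds.filter(
    lambda row: row["convicted"] == 1 and row["costs to defendant"] == 1
)
print(len(convicted), "matching decisions")
print(convicted[0]["offense"][:500])  # first 500 characters of the statement of facts
```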
## Dataset Creation, Data Collection and Processing
Data creation:  Federal governments in Germany<br>
Data collection:  [github.com/niklaswais/gesp](https://github.com/niklaswais/gesp) on January 1, 2024<br>
Data processing:  [github.com/SH-108/german_court_decisions](https://github.com/SH-108/german_court_decisions/blob/main/postprocess_court_decisions.ipynb) | SH108/german-court-decisions | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:de",
"license:mit",
"legal",
"region:us"
] | 2024-01-21T19:17:30+00:00 | {"language": ["de"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "pretty_name": "german_court_decisions", "tags": ["legal"]} | 2024-01-24T14:38:32+00:00 | [] | [
"de"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-German #license-mit #legal #region-us
| # Dataset Card for german-court-decisions
60k judicial decisions in Germany retrieved on January 1, 2024.
# Dataset Description
Language(s) (NLP): German<br>
License: MIT
Copyright notice: Automated retrieval of decisions from federal and state databases in Germany is permitted for non-commercial purposes only. As a result, the use of this dataset is permitted for non-commercial purposes only.
## Uses
Prediction of verdicts based on statement of facts.
### Direct Use
Classification convicted/dismissed etc.
### Out-of-Scope Use
Text generation of verdicts based on statement of facts.
## Dataset Structure
:  court name (string), e.g. 'AG Bad Neuenahr-Ahrweiler'; may be empty if it does not match the search mask<br>
:  federated state (string), e.g. 'Brandenburg', 'Berlin', ... , 'Thüringen' <br>
:  date (string), e.g. '03.04.2013'; may be empty if it does not match the search mask<br>
:  statement of facts (string; _Tatbestand_ in German)<br>
:  verdict (string; _Tenor_ in German)<br>
:  Boolean (0/1), text phrase detection (see data processing below)<br>
:  Boolean (0/1), text phrase detection (see data processing below)<br>
:  Boolean (0/1), text phrase detection (see data processing below)<br>
:  Boolean (0/1), text phrase detection (see data processing below)<br>
## Dataset Creation, Data Collection and Processing
Data creation:  Federal governments in Germany<br>
Data collection:  URL on January 1, 2024<br>
Data processing:  URL | [
"# Dataset Card for german-court-decisions\n60k judicial decisions in Germany retrieved on January 1, 2024.",
"# Dataset Description\n\nLanguage(s) (NLP): German<br>\nLicense: MIT\n\nCopyright notice: Automated retrieval of decisions from federal and state databases in Germany is permitted for non-commercial purposes only. As a result, the use of this dataset is permitted for non-commercial purposes only.",
"## Uses\n\nPrediction of verdicts based on statement of facts.",
"### Direct Use\n\nClassification convicted/dismissed etc.",
"### Out-of-Scope Use\n\nText generation of verdicts based on statement of facts.",
"## Dataset Structure\n\n\n\n:  court name (string), e.g. 'AG Bad Neuenahr-Ahrweiler'; may be empty if it does not match the search mask<br>\n:  federated state (string), e.g. 'Brandenburg', 'Berlin', ... , 'Thüringen' <br>\n:  date (string), e.g. '03.04.2013'; may be empty if it does not match the search mask<br>\n:  statement of facts (string; _Tatbestand_ in German)<br>\n:  verdict (string; _Tenor_ in German)<br>\n\n:  Boolean (0/1), text phrase detection (see data processing below)<br>\n:  Boolean (0/1), text phrase detection (see data processing below)<br>\n:  Boolean (0/1), text phrase detection (see data processing below)<br>\n:  Boolean (0/1), text phrase detection (see data processing below)<br>",
"## Dataset Creation, Data Collection and Processing\n\nData creation:  Federal governments in Germany<br>\nData collection:  URL on January 1, 2024<br>\nData processing:  URL"
] | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-German #license-mit #legal #region-us \n",
"# Dataset Card for german-court-decisions\n60k judicial decisions in Germany retrieved on January 1, 2024.",
"# Dataset Description\n\nLanguage(s) (NLP): German<br>\nLicense: MIT\n\nCopyright notice: Automated retrieval of decisions from federal and state databases in Germany is permitted for non-commercial purposes only. As a result, the use of this dataset is permitted for non-commercial purposes only.",
"## Uses\n\nPrediction of verdicts based on statement of facts.",
"### Direct Use\n\nClassification convicted/dismissed etc.",
"### Out-of-Scope Use\n\nText generation of verdicts based on statement of facts.",
"## Dataset Structure\n\n\n\n:  court name (string), e.g. 'AG Bad Neuenahr-Ahrweiler'; may be empty if it does not match the search mask<br>\n:  federated state (string), e.g. 'Brandenburg', 'Berlin', ... , 'Thüringen' <br>\n:  date (string), e.g. '03.04.2013'; may be empty if it does not match the search mask<br>\n:  statement of facts (string; _Tatbestand_ in German)<br>\n:  verdict (string; _Tenor_ in German)<br>\n\n:  Boolean (0/1), text phrase detection (see data processing below)<br>\n:  Boolean (0/1), text phrase detection (see data processing below)<br>\n:  Boolean (0/1), text phrase detection (see data processing below)<br>\n:  Boolean (0/1), text phrase detection (see data processing below)<br>",
"## Dataset Creation, Data Collection and Processing\n\nData creation:  Federal governments in Germany<br>\nData collection:  URL on January 1, 2024<br>\nData processing:  URL"
] |
8aad0b5cd79aa16ada3ac89033d255e402a5b3ea |
# Dataset of you (Granblue Fantasy)
This is the dataset of you (Granblue Fantasy), containing 58 images and their tags.
The core tags of this character are `animal_ears, fox_ears, short_hair, hair_ornament, tail, fox_tail, fox_girl, red_eyes, blue_hair, multiple_tails, hair_between_eyes, bangs, breasts, large_tail, black_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 58 | 82.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/you_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 58 | 46.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/you_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 131 | 97.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/you_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 58 | 72.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/you_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 131 | 138.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/you_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/you_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
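Beyond printing the metadata, the raw package can be filtered by tag before training. The sketch below copies only the images tagged `solo` (a tag that appears in the clusters below) into a separate folder; it assumes the images sit directly under the extracted directory and that `item.meta['tags']` is a list or mapping of tag names.
```python
import os
import shutil

from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'   # directory extracted in the snippet above
solo_dir = 'dataset_solo'
os.makedirs(solo_dir, exist_ok=True)

# copy only the images tagged 'solo' into a separate folder
# (membership check works whether meta['tags'] is a list or a mapping)
for item in LocalSource(dataset_dir):
    tags = item.meta.get('tags') or {}
    if 'solo' in tags:
        src = os.path.join(dataset_dir, item.meta['filename'])
        if os.path.exists(src):
            shutil.copy2(src, os.path.join(solo_dir, os.path.basename(item.meta['filename'])))
```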
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, erune, solo, collarbone, looking_at_viewer, bare_shoulders, blush, open_mouth, white_background, backless_outfit, fang, simple_background |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, erune, solo, closed_mouth, collarbone, looking_at_viewer, shorts, backless_outfit, off_shoulder, simple_background, small_breasts, white_background, bandages, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | solo | collarbone | looking_at_viewer | bare_shoulders | blush | open_mouth | white_background | backless_outfit | fang | simple_background | closed_mouth | shorts | off_shoulder | small_breasts | bandages | navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:--------------------|:-----------------|:--------|:-------------|:-------------------|:------------------|:-------|:--------------------|:---------------|:---------|:---------------|:----------------|:-----------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | X | X | X | X | X | X |
| CyberHarem/you_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T19:21:51+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T19:33:24+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of you (Granblue Fantasy)
=================================
This is the dataset of you (Granblue Fantasy), containing 58 images and their tags.
The core tags of this character are 'animal\_ears, fox\_ears, short\_hair, hair\_ornament, tail, fox\_tail, fox\_girl, red\_eyes, blue\_hair, multiple\_tails, hair\_between\_eyes, bangs, breasts, large\_tail, black\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
706c8f5aa5be957c493b1beda60cc49551fcdb31 |
# Dataset of anthuria (Granblue Fantasy)
This is the dataset of anthuria (Granblue Fantasy), containing 177 images and their tags.
The core tags of this character are `animal_ears, long_hair, red_hair, breasts, red_eyes, hair_ornament, bangs, medium_breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 177 | 301.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anthuria_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 177 | 169.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anthuria_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 433 | 358.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anthuria_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 177 | 270.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anthuria_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 433 | 520.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anthuria_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/anthuria_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
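To sanity-check the resolution property of the raw package ("min edge aligned to 1400 if larger"), a quick sketch with Pillow can gather shortest-edge statistics; Pillow and the listed image extensions are assumptions, not requirements of the loading code above.
```python
from pathlib import Path

from PIL import Image  # Pillow, assumed to be installed

# gather shortest-edge statistics for the extracted raw package; per the table
# above, the raw package has the min edge aligned to 1400 if it was larger
dataset_dir = Path('dataset_dir')
edges = []
for path in dataset_dir.rglob('*'):
    if path.suffix.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        with Image.open(path) as im:
            edges.append(min(im.size))

if edges:
    print('images:', len(edges))
    print('shortest edge range:', min(edges), '-', max(edges))
```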
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 32 |  |  |  |  |  | 1girl, bare_shoulders, erune, red_dress, solo, looking_at_viewer, black_gloves, elbow_gloves, cleavage, smile, black_thighhighs, blush, single_glove, detached_sleeves, open_mouth, simple_background |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blush, erune, looking_at_viewer, looking_back, solo, bare_back, detached_sleeves, from_behind, backless_outfit, hair_flower, sideboob, braid, shoulder_blades, smile, upper_body, black_bow, closed_mouth, hair_bow, kimono |
| 2 | 16 |  |  |  |  |  | 1girl, erune, looking_at_viewer, solo, black_kimono, hair_flower, bare_shoulders, blush, detached_sleeves, wide_sleeves, obi, black_gloves, braid, closed_mouth, long_sleeves, simple_background, smile, white_background, floral_print, hair_between_eyes, single_glove, bow, bracelet, cleavage_cutout, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | erune | red_dress | solo | looking_at_viewer | black_gloves | elbow_gloves | cleavage | smile | black_thighhighs | blush | single_glove | detached_sleeves | open_mouth | simple_background | looking_back | bare_back | from_behind | backless_outfit | hair_flower | sideboob | braid | shoulder_blades | upper_body | black_bow | closed_mouth | hair_bow | kimono | black_kimono | wide_sleeves | obi | long_sleeves | white_background | floral_print | hair_between_eyes | bow | bracelet | cleavage_cutout |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:------------|:-------|:--------------------|:---------------|:---------------|:-----------|:--------|:-------------------|:--------|:---------------|:-------------------|:-------------|:--------------------|:---------------|:------------|:--------------|:------------------|:--------------|:-----------|:--------|:------------------|:-------------|:------------|:---------------|:-----------|:---------|:---------------|:---------------|:------|:---------------|:-------------------|:---------------|:--------------------|:------|:-----------|:------------------|
| 0 | 32 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | X | | | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | X | X | | X | X | X | | | X | | X | X | X | | X | | | | | X | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/anthuria_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T19:21:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:00:29+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of anthuria (Granblue Fantasy)
======================================
This is the dataset of anthuria (Granblue Fantasy), containing 177 images and their tags.
The core tags of this character are 'animal\_ears, long\_hair, red\_hair, breasts, red\_eyes, hair\_ornament, bangs, medium\_breasts, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
0f0338d359215fb384f2ce44c4e2ef85f32e3be6 |
# Dataset of izmir (Granblue Fantasy)
This is the dataset of izmir (Granblue Fantasy), containing 198 images and their tags.
The core tags of this character are `horns, long_hair, breasts, pointy_ears, hair_ornament, hair_between_eyes, large_breasts, mole_under_mouth, mole, blue_hair, hair_flower, red_eyes, very_long_hair, ponytail, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 198 | 273.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izmir_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 198 | 166.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izmir_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 472 | 348.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izmir_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 198 | 244.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izmir_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 472 | 474.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izmir_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/izmir_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
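For quick inspection outside Python, the tags can also be dumped to a CSV file. The sketch below reuses the `dataset_dir` extracted above and assumes `item.meta['tags']` is a list of tags or a tag-to-score mapping.
```python
import csv

from waifuc.source import LocalSource

# dump a filename -> tags table for quick inspection in a spreadsheet
# (assumes meta['tags'] is a list of tags or a tag -> score mapping)
with open('izmir_tags.csv', 'w', newline='', encoding='utf-8') as f:
    writer = csv.writer(f)
    writer.writerow(['filename', 'tags'])
    for item in LocalSource('dataset_dir'):
        tags = item.meta.get('tags') or []
        tag_list = list(tags.keys()) if isinstance(tags, dict) else list(tags)
        writer.writerow([item.meta['filename'], ', '.join(tag_list)])
```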
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, draph, solo, elbow_gloves, looking_at_viewer, white_dress, white_gloves, brown_eyes, fur_collar, white_headwear, smile, blush, rose, snowflake_hair_ornament |
| 1 | 11 |  |  |  |  |  | 1boy, 1girl, bare_shoulders, blush, draph, flower, solo_focus, cum_on_breasts, smile, snowflake_hair_ornament, collarbone, elbow_gloves, penis, white_gloves, bangs, dress, huge_breasts, open_mouth, paizuri_under_clothes, brown_eyes, eyes_visible_through_hair, looking_at_viewer, white_headwear, facial, heart, nipples |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blush, draph, flower, official_alternate_costume, smile, solo, looking_at_viewer, open_mouth, underboob_cutout, white_one-piece_swimsuit, collarbone, simple_background, white_background, navel, snowflake_hair_ornament, thigh_strap, thighs |
| 3 | 9 |  |  |  |  |  | 1girl, bare_shoulders, draph, official_alternate_costume, smile, solo, blush, flower, navel, swimsuit, looking_at_viewer, open_mouth, simple_background, snowflake_hair_ornament, underboob_cutout |
| 4 | 7 |  |  |  |  |  | 1girl, draph, looking_at_viewer, smile, solo, blush, braid, hair_bow, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | cleavage | draph | solo | elbow_gloves | looking_at_viewer | white_dress | white_gloves | brown_eyes | fur_collar | white_headwear | smile | blush | rose | snowflake_hair_ornament | 1boy | flower | solo_focus | cum_on_breasts | collarbone | penis | bangs | dress | huge_breasts | open_mouth | paizuri_under_clothes | eyes_visible_through_hair | facial | heart | nipples | official_alternate_costume | underboob_cutout | white_one-piece_swimsuit | simple_background | white_background | navel | thigh_strap | thighs | swimsuit | braid | hair_bow | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:--------|:-------|:---------------|:--------------------|:--------------|:---------------|:-------------|:-------------|:-----------------|:--------|:--------|:-------|:--------------------------|:-------|:---------|:-------------|:-----------------|:-------------|:--------|:--------|:--------|:---------------|:-------------|:------------------------|:----------------------------|:---------|:--------|:----------|:-----------------------------|:-------------------|:---------------------------|:--------------------|:-------------------|:--------|:--------------|:---------|:-----------|:--------|:-----------|:----------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | | X | X | | X | X | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | X | X | | X | | | | | | X | X | | X | | X | | | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | X | X | | X | | | | | | X | X | | X | | X | | | | | | | | X | | | | | | X | X | | X | | X | | | X | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | X | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X |
| CyberHarem/izmir_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T19:32:32+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:12:45+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of izmir (Granblue Fantasy)
===================================
This is the dataset of izmir (Granblue Fantasy), containing 198 images and their tags.
The core tags of this character are 'horns, long\_hair, breasts, pointy\_ears, hair\_ornament, hair\_between\_eyes, large\_breasts, mole\_under\_mouth, mole, blue\_hair, hair\_flower, red\_eyes, very\_long\_hair, ponytail, hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6240f523fb3fbc0239a1c84bd682aabad89f7c3a |
# Dataset of melissabelle (Granblue Fantasy)
This is the dataset of melissabelle (Granblue Fantasy), containing 70 images and their tags.
The core tags of this character are `blonde_hair, long_hair, pointy_ears, very_long_hair, ahoge, green_eyes, bangs, hair_intakes, prehensile_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 70 | 72.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissabelle_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 70 | 48.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissabelle_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 154 | 99.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissabelle_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 70 | 67.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissabelle_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 154 | 127.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissabelle_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/melissabelle_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 36 |  |  |  |  |  | 1girl, solo, white_dress, harvin, bare_shoulders, strap_slip, bracelet, blush, looking_at_viewer, white_background, collarbone, closed_mouth, simple_background, hair_ornament, smile, hair_between_eyes |
| 1 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, ponytail, smile, solo, braid, hair_bow, harvin, blush, long_sleeves, white_dress, black_bow, closed_mouth, basket, corn, food, heart, holding, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_dress | harvin | bare_shoulders | strap_slip | bracelet | blush | looking_at_viewer | white_background | collarbone | closed_mouth | simple_background | hair_ornament | smile | hair_between_eyes | ponytail | braid | hair_bow | long_sleeves | black_bow | basket | corn | food | heart | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:---------|:-----------------|:-------------|:-----------|:--------|:--------------------|:-------------------|:-------------|:---------------|:--------------------|:----------------|:--------|:--------------------|:-----------|:--------|:-----------|:---------------|:------------|:---------|:-------|:-------|:--------|:----------|
| 0 | 36 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | | | | X | X | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/melissabelle_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T19:47:59+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:00:47+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of melissabelle (Granblue Fantasy)
==========================================
This is the dataset of melissabelle (Granblue Fantasy), containing 70 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, pointy\_ears, very\_long\_hair, ahoge, green\_eyes, bangs, hair\_intakes, prehensile\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
8545c7522593977592f77213c66e2543dd44b880 |
[<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-dark.png" alt="Built with Distilabel" width="200" height="32"/>](https://github.com/argilla-io/distilabel)
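Going by the feature list in the dataset metadata (`input`, `generations`, `rating`, `rationale`, ...), a minimal exploration sketch could look like the following; the element-wise alignment of `rating` and `rationale` with `generations` is an assumption, not something stated in the card.
```python
from datasets import load_dataset

# load the preference data (the metadata above lists a single "train" split)
ds = load_dataset("macadeliccc/distilabel-code-preferences", split="train")

row = ds[0]
# assumption: "rating" and "rationale" are aligned element-wise with "generations"
pairs = zip(row["generations"], row["rating"], row["rationale"])
best_gen, best_rating, best_reason = max(
    pairs, key=lambda p: p[1] if p[1] is not None else float('-inf')
)

print("Prompt:", row["input"][:300])
print(f"Best-rated generation ({best_rating}):", best_gen[:300])
print("Rationale:", best_reason[:300])
```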
| macadeliccc/distilabel-code-preferences | [
"synthetic",
"distilabel",
"region:us"
] | 2024-01-21T19:55:18+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "generation_prompt", "list": {"list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}}, {"name": "raw_generation_responses", "sequence": "string"}, {"name": "generations", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 24108646, "num_examples": 1000}], "download_size": 9040100, "dataset_size": 24108646}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["synthetic", "distilabel"]} | 2024-02-05T16:28:55+00:00 | [] | [] | TAGS
#synthetic #distilabel #region-us
| <img src="URL alt="Built with Distilabel" width="200" height="32"/>
| [] | [
"TAGS\n#synthetic #distilabel #region-us \n"
] |
64c6dd92ad4aa08fdac9f25b4a38d8d53231a806 |
# Dataset of kolulu (Granblue Fantasy)
This is the dataset of kolulu (Granblue Fantasy), containing 45 images and their tags.
The core tags of this character are `green_hair, hair_over_one_eye, long_hair, dark_skin, dark-skinned_female, green_eyes, hat, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 75.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 39.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 111 | 86.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 64.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 111 | 130.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
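If you only need one of the packaged variants from the table above, the `IMG+TXT` packages pair each image with a same-named `.txt` tag file; the sketch below (that pairing and a flat archive layout are assumptions based on the `IMG+TXT` label, not something spelled out in this card) downloads the 800px package and walks the image/tag pairs:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/kolulu_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it locally
pkg_dir = 'dataset_800'
os.makedirs(pkg_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pkg_dir)

# walk the assumed <name>.<ext> / <name>.txt pairs (assumes a flat layout)
for fname in sorted(os.listdir(pkg_dir)):
    stem, ext = os.path.splitext(fname)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        txt_path = os.path.join(pkg_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                print(fname, '->', f.read().strip())
```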
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kolulu_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, solo, navel, hat_flower, outdoors, beach, looking_at_viewer, open_mouth, smile, white_bikini, blush, holding, ocean, sky, day, food |
| 1 | 7 |  |  |  |  |  | 1girl, bandaged_leg, solo, looking_at_viewer, navel, cloak, holding, staff, closed_mouth, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | navel | hat_flower | outdoors | beach | looking_at_viewer | open_mouth | smile | white_bikini | blush | holding | ocean | sky | day | food | bandaged_leg | cloak | staff | closed_mouth | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:-----------|:--------|:--------------------|:-------------|:--------|:---------------|:--------|:----------|:--------|:------|:------|:-------|:---------------|:--------|:--------|:---------------|:--------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | | | X | | | | | X | | | | | X | X | X | X | X |
| CyberHarem/kolulu_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:02:30+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:13:18+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kolulu (Granblue Fantasy)
====================================
This is the dataset of kolulu (Granblue Fantasy), containing 45 images and their tags.
The core tags of this character are 'green\_hair, hair\_over\_one\_eye, long\_hair, dark\_skin, dark-skinned\_female, green\_eyes, hat, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
40eeef6be2d112ab74c21a3e231f952a383bd13e |
# Dataset of esser (Granblue Fantasy)
This is the dataset of esser (Granblue Fantasy), containing 87 images and their tags.
The core tags of this character are `long_hair, animal_ears, pink_hair, blue_eyes, braid, breasts, very_long_hair, ahoge, bangs, large_breasts, twin_braids`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 87 | 96.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/esser_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 87 | 59.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/esser_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 181 | 115.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/esser_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 87 | 86.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/esser_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 181 | 161.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/esser_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/esser_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, erune, simple_background, solo, white_background, bare_shoulders, cleavage, collarbone, looking_at_viewer, hair_ornament, panties, see-through, thighs |
| 1 | 12 |  |  |  |  |  | 1girl, erune, solo, makeup, ass, bare_shoulders, leotard, looking_at_viewer, cape, handgun, holding_gun |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | erune | simple_background | solo | white_background | bare_shoulders | cleavage | collarbone | looking_at_viewer | hair_ornament | panties | see-through | thighs | makeup | ass | leotard | cape | handgun | holding_gun |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:-------|:-------------------|:-----------------|:-----------|:-------------|:--------------------|:----------------|:----------|:--------------|:---------|:---------|:------|:----------|:-------|:----------|:--------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | X | | X | | X | | | X | | | | | X | X | X | X | X | X |
| CyberHarem/esser_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:02:32+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:22:53+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of esser (Granblue Fantasy)
===================================
This is the dataset of esser (Granblue Fantasy), containing 87 images and their tags.
The core tags of this character are 'long\_hair, animal\_ears, pink\_hair, blue\_eyes, braid, breasts, very\_long\_hair, ahoge, bangs, large\_breasts, twin\_braids', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d776e9b06cbe017043bfd0d87936e20fdbcabd53 |
# Dataset of razia (Granblue Fantasy)
This is the dataset of razia (Granblue Fantasy), containing 137 images and their tags.
The core tags of this character are `horns, long_hair, blonde_hair, breasts, blue_eyes, pointy_ears, large_breasts, very_long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 137 | 171.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 137 | 100.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 324 | 211.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 137 | 154.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 324 | 288.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/razia_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | blush, draph, playboy_bunny, rabbit_ears, 1girl, fake_animal_ears, wrist_cuffs, blue_leotard, cleavage, detached_collar, solo, white_background, bare_shoulders, bowtie, simple_background, blue_footwear, full_body, high_heels, navel_cutout, rabbit_tail, fake_tail, open_mouth, thigh_strap, tray, white_pantyhose |
| 1 | 6 |  |  |  |  |  | 1girl, armor, draph, looking_at_viewer, solo, blush, cleavage, gauntlets, gloves, open_mouth, simple_background, thighhighs, bare_shoulders, skirt, white_background, holding_weapon |
| 2 | 10 |  |  |  |  |  | 1girl, draph, hat, solo, looking_at_viewer, black_gloves, blush, simple_background, juliet_sleeves, white_background, dress, pelvic_curtain, black_thighhighs |
| 3 | 9 |  |  |  |  |  | 1girl, draph, looking_at_viewer, ponytail, solo, blue_skirt, blush, bag, black_thighhighs, frills, green_jacket, hair_bow, long_sleeves, navel, necklace, school_uniform, belt, blazer, cleavage, open_mouth, simple_background, white_shirt, hand_on_hip, miniskirt, open_jacket, panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | draph | playboy_bunny | rabbit_ears | 1girl | fake_animal_ears | wrist_cuffs | blue_leotard | cleavage | detached_collar | solo | white_background | bare_shoulders | bowtie | simple_background | blue_footwear | full_body | high_heels | navel_cutout | rabbit_tail | fake_tail | open_mouth | thigh_strap | tray | white_pantyhose | armor | looking_at_viewer | gauntlets | gloves | thighhighs | skirt | holding_weapon | hat | black_gloves | juliet_sleeves | dress | pelvic_curtain | black_thighhighs | ponytail | blue_skirt | bag | frills | green_jacket | hair_bow | long_sleeves | navel | necklace | school_uniform | belt | blazer | white_shirt | hand_on_hip | miniskirt | open_jacket | panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------------|:--------------|:--------|:-------------------|:--------------|:---------------|:-----------|:------------------|:-------|:-------------------|:-----------------|:---------|:--------------------|:----------------|:------------|:-------------|:---------------|:--------------|:------------|:-------------|:--------------|:-------|:------------------|:--------|:--------------------|:------------|:---------|:-------------|:--------|:-----------------|:------|:---------------|:-----------------|:--------|:-----------------|:-------------------|:-----------|:-------------|:------|:---------|:---------------|:-----------|:---------------|:--------|:-----------|:-----------------|:-------|:---------|:--------------|:--------------|:------------|:--------------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | X | | | | X | | X | X | X | | X | | | | | | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | | | X | | | | | | X | X | | | X | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | | X | | | | X | | X | | | | X | | | | | | | X | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/razia_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:02:38+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:28:52+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of razia (Granblue Fantasy)
===================================
This is the dataset of razia (Granblue Fantasy), containing 137 images and their tags.
The core tags of this character are 'horns, long\_hair, blonde\_hair, breasts, blue\_eyes, pointy\_ears, large\_breasts, very\_long\_hair, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a86a3e6c0c1977d895e8cdd5b02d47bb95d97176 |
# BabyLM 100M
This curated dataset is originally from the [BabyLM Challenge](https://babylm.github.io/guidelines.html).
It consists of ~100M words of mixed-domain text, drawn from the following sources:
- CHILDES (child-directed speech)
- Subtitles (speech)
- BNC (speech)
- TED talks (speech)
- children's books (simple written language) | nilq/babylm-100M | [
"language:en",
"region:us"
] | 2024-01-21T20:09:47+00:00 | {"language": ["en"], "pretty_name": "BabyLM 100M", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 569602489, "num_examples": 10587551}, {"name": "validation", "num_bytes": 55093483, "num_examples": 1026747}, {"name": "test", "num_bytes": 60175255, "num_examples": 1054646}], "download_size": 429629738, "dataset_size": 684871227}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-21T20:20:14+00:00 | [] | [
"en"
] | TAGS
#language-English #region-us
|
# BabyLM 100M
This curated dataset is originally from the BabyLM Challenge.
It consists of ~100M words of mixed domain, consisting of the following sources:
- CHILDES (child-directed speech)
- Subtitles (speech)
- BNC (speech)
- TED talks (speech)
- children's books (simple written language) | [
"# BabyLM 100M\n\nThis curated dataset is originally from the BabyLM Challenge.\n\nIt consists of ~100M words of mixed domain, consisting of the following sources:\n- CHILDES (child-directed speech)\n- Subtitles (speech)\n- BNC (speech)\n- TED talks (speech)\n- children's books (simple written language)"
] | [
"TAGS\n#language-English #region-us \n",
"# BabyLM 100M\n\nThis curated dataset is originally from the BabyLM Challenge.\n\nIt consists of ~100M words of mixed domain, consisting of the following sources:\n- CHILDES (child-directed speech)\n- Subtitles (speech)\n- BNC (speech)\n- TED talks (speech)\n- children's books (simple written language)"
] |
3318c884a7e21311c0ae358058dde4d55ffa7121 |
# BabyLM 10M
This curated dataset is originally from the [BabyLM Challenge](https://babylm.github.io/guidelines.html).
It consists of ~10M words of mixed-domain text, drawn from the following sources:
- CHILDES (child-directed speech)
- Subtitles (speech)
- BNC (speech)
- TED talks (speech)
- children's books (simple written language)
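
The repository metadata defines a default configuration with `train`, `validation`, and `test` splits, each row exposing a single `text` field, so the whole thing loads directly with the `datasets` library; a minimal sketch:

```python
from datasets import load_dataset

# default configuration; split names follow the repository metadata
babylm = load_dataset("nilq/babylm-10M")

print(babylm)                      # DatasetDict with train / validation / test
print(babylm["train"][0]["text"])  # each example exposes a single 'text' field
```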
| nilq/babylm-10M | [
"language:en",
"region:us"
] | 2024-01-21T20:10:47+00:00 | {"language": ["en"], "pretty_name": "BabyLM 10M", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 57802971, "num_examples": 1058740}, {"name": "validation", "num_bytes": 55093483, "num_examples": 1026747}, {"name": "test", "num_bytes": 60175255, "num_examples": 1054646}], "download_size": 108417116, "dataset_size": 173071709}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-21T20:19:47+00:00 | [] | [
"en"
] | TAGS
#language-English #region-us
|
# BabyLM 10M
This curated dataset is originally from the BabyLM Challenge.
It consists of ~10M words of mixed domain, consisting of the following sources:
- CHILDES (child-directed speech)
- Subtitles (speech)
- BNC (speech)
- TED talks (speech)
- children's books (simple written language)
| [
"# BabyLM 10M\n\nThis curated dataset is originally from the BabyLM Challenge.\n\nIt consists of ~10M words of mixed domain, consisting of the following sources:\n- CHILDES (child-directed speech)\n- Subtitles (speech)\n- BNC (speech)\n- TED talks (speech)\n- children's books (simple written language)"
] | [
"TAGS\n#language-English #region-us \n",
"# BabyLM 10M\n\nThis curated dataset is originally from the BabyLM Challenge.\n\nIt consists of ~10M words of mixed domain, consisting of the following sources:\n- CHILDES (child-directed speech)\n- Subtitles (speech)\n- BNC (speech)\n- TED talks (speech)\n- children's books (simple written language)"
] |
8dd129bfe503302ce4aeca277255f525839f40cf |
# Dataset Card for Evaluation run of binbi/SF-72B-V1.8.6-V1.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [binbi/SF-72B-V1.8.6-V1.2](https://huggingface.co/binbi/SF-72B-V1.8.6-V1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2",
"harness_winogrande_5",
split="train")
```
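
With 63 task configurations (plus the aggregated "results" configuration mentioned above) in the repository, it can be handy to enumerate them programmatically before picking one; a short sketch using the standard `datasets` helpers:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2"

# list every available configuration (one per evaluated task)
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.", configs[:5])

# then load whichever task you are interested in
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="train")
print(gsm8k_details)
```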
## Latest results
These are the [latest results from run 2024-01-21T20:26:22.258506](https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2/blob/main/results_2024-01-21T20-26-22.258506.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2312183583064229,
"acc_stderr": 0.029963667974972664,
"acc_norm": 0.2311618522242625,
"acc_norm_stderr": 0.030751973434955327,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.4877798130299791,
"mc2_stderr": 0.016318959342538
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202605,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.25801633140808605,
"acc_stderr": 0.004366488167386393,
"acc_norm": 0.24865564628560047,
"acc_norm_stderr": 0.004313503876346078
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.02590789712240817,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.02590789712240817
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371376,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371376
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.02910522083322462,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.02910522083322462
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21711366538952745,
"acc_stderr": 0.014743125394823295,
"acc_norm": 0.21711366538952745,
"acc_norm_stderr": 0.014743125394823295
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132226,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.4877798130299791,
"mc2_stderr": 0.016318959342538
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076906
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2 | [
"region:us"
] | 2024-01-21T20:15:07+00:00 | {"pretty_name": "Evaluation run of binbi/SF-72B-V1.8.6-V1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [binbi/SF-72B-V1.8.6-V1.2](https://huggingface.co/binbi/SF-72B-V1.8.6-V1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T20:26:22.258506](https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2/blob/main/results_2024-01-21T20-26-22.258506.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2312183583064229,\n \"acc_stderr\": 0.029963667974972664,\n \"acc_norm\": 0.2311618522242625,\n \"acc_norm_stderr\": 0.030751973434955327,\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4877798130299791,\n \"mc2_stderr\": 0.016318959342538\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202605,\n \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25801633140808605,\n \"acc_stderr\": 0.004366488167386393,\n \"acc_norm\": 0.24865564628560047,\n \"acc_norm_stderr\": 0.004313503876346078\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.02590789712240817,\n \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.02590789712240817\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371376,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371376\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n \"acc_stderr\": 0.02910522083322462,\n \"acc_norm\": 0.25112107623318386,\n \"acc_norm_stderr\": 0.02910522083322462\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.21711366538952745,\n \"acc_stderr\": 0.014743125394823295,\n \"acc_norm\": 0.21711366538952745,\n \"acc_norm_stderr\": 0.014743125394823295\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.032467217651178264,\n \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.032467217651178264\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4877798130299791,\n \"mc2_stderr\": 0.016318959342538\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076906\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n 
}\n}\n```", "repo_url": "https://huggingface.co/binbi/SF-72B-V1.8.6-V1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-13-01.457531.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-13-01.457531.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-26-22.258506.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-26-22.258506.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-26-22.258506.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-26-22.258506.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-13-01.457531.parquet"]}, 
{"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["**/details_harness|winogrande|5_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": ["**/details_harness|winogrande|5_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T20-26-22.258506.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T20_13_01.457531", "path": ["results_2024-01-21T20-13-01.457531.parquet"]}, {"split": "2024_01_21T20_26_22.258506", "path": 
["results_2024-01-21T20-26-22.258506.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T20-26-22.258506.parquet"]}]}]} | 2024-01-21T20:28:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of binbi/SF-72B-V1.8.6-V1.2
Dataset automatically created during the evaluation run of model binbi/SF-72B-V1.8.6-V1.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
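For example, a minimal sketch is shown below. Note that the repository name is an assumption inferred from the usual `open-llm-leaderboard/details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configurations listed for this dataset:

```python
from datasets import load_dataset

# Assumed repository name, following the Open LLM Leaderboard
# "details_<org>__<model>" convention for this model.
data = load_dataset(
    "open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2",
    "harness_winogrande_5",
    split="latest",  # "train" is documented to point to the same latest results
)
```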
## Latest results
These are the latest results from run 2024-01-21T20:26:22.258506 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
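To inspect the aggregated metrics directly, the `results` configuration can be loaded in the same way (a sketch, using the same assumed repository name as above):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2",
    "results",
    split="latest",
)
```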
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of binbi/SF-72B-V1.8.6-V1.2\n\n\n\nDataset automatically created during the evaluation run of model binbi/SF-72B-V1.8.6-V1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T20:26:22.258506(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of binbi/SF-72B-V1.8.6-V1.2\n\n\n\nDataset automatically created during the evaluation run of model binbi/SF-72B-V1.8.6-V1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T20:26:22.258506(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
20edf611576c5217e57cf1423c8607c153ca3ced |
# Dataset of satyr (Granblue Fantasy)
This is the dataset of satyr (Granblue Fantasy), containing 83 images and their tags.
The core tags of this character are `long_hair, breasts, animal_ears, purple_hair, twintails, very_long_hair, purple_eyes, bangs, large_breasts, low_twintails, ribbon, hair_ribbon, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 83 | 143.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satyr_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 83 | 80.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satyr_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 209 | 175.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satyr_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 83 | 128.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satyr_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 209 | 256.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satyr_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
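For the IMG+TXT packages, the sketch below downloads one archive and walks the image/tag pairs. It assumes each image in the archive ships with a same-named `.txt` file holding its comma-separated tags (the extension list is illustrative):

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages (the 800px version here)
zip_file = hf_hub_download(
    repo_id='CyberHarem/satyr_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to a local directory
dataset_dir = 'satyr_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumption: every image has a same-named .txt file with its tags
for txt_path in glob(os.path.join(dataset_dir, '**', '*.txt'), recursive=True):
    base, _ = os.path.splitext(txt_path)
    images = [base + ext for ext in ('.png', '.jpg', '.jpeg', '.webp') if os.path.exists(base + ext)]
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(images, tags)
```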
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/satyr_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, dress, open_mouth, simple_background, blush, star_(symbol), white_background, :d, skirt, black_capelet, fur_trim |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, black_gloves, looking_at_viewer, smile, solo, cleavage, simple_background, star_(symbol), white_background, garter_straps, open_mouth, black_thighhighs, earrings, full_body, microphone, mismatched_gloves, one_eye_closed, single_elbow_glove |
| 2 | 6 |  |  |  |  |  | 1girl, blue_sky, cloud, day, looking_at_viewer, navel, outdoors, smile, solo, cleavage, hair_flower, open_mouth, bikini_skirt, black_bikini, blush, ocean, star_(symbol), water, ;d, belt, black_skirt, bow, collarbone, frilled_bikini, one_eye_closed, standing, sunlight, wading |
| 3 | 6 |  |  |  |  |  | 1girl, blush, futanari, open_mouth, solo, erection, tongue_out, heart, large_penis, testicles, censored, cleavage, ejaculation, looking_at_viewer, nipples, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | cleavage | dress | open_mouth | simple_background | blush | star_(symbol) | white_background | :d | skirt | black_capelet | fur_trim | bare_shoulders | black_dress | black_gloves | smile | garter_straps | black_thighhighs | earrings | full_body | microphone | mismatched_gloves | one_eye_closed | single_elbow_glove | blue_sky | cloud | day | navel | outdoors | hair_flower | bikini_skirt | black_bikini | ocean | water | ;d | belt | black_skirt | bow | collarbone | frilled_bikini | standing | sunlight | wading | futanari | erection | tongue_out | heart | large_penis | testicles | censored | ejaculation | nipples | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:--------|:-------------|:--------------------|:--------|:----------------|:-------------------|:-----|:--------|:----------------|:-----------|:-----------------|:--------------|:---------------|:--------|:----------------|:-------------------|:-----------|:------------|:-------------|:--------------------|:-----------------|:---------------------|:-----------|:--------|:------|:--------|:-----------|:--------------|:---------------|:---------------|:--------|:--------|:-----|:-------|:--------------|:------|:-------------|:-----------------|:-----------|:-----------|:---------|:-----------|:-----------|:-------------|:--------|:--------------|:------------|:-----------|:--------------|:----------|:--------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | X | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | | X | X | | | | | | | | | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/satyr_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:27:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:47:22+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of satyr (Granblue Fantasy)
===================================
This is the dataset of satyr (Granblue Fantasy), containing 83 images and their tags.
The core tags of this character are 'long\_hair, breasts, animal\_ears, purple\_hair, twintails, very\_long\_hair, purple\_eyes, bangs, large\_breasts, low\_twintails, ribbon, hair\_ribbon, hair\_between\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
800803bab9fc1cbcc6680e0e9969668fee585549 |
# Dataset of godguard_brodia (Granblue Fantasy)
This is the dataset of godguard_brodia (Granblue Fantasy), containing 226 images and their tags.
The core tags of this character are `red_hair, long_hair, breasts, blue_eyes, hair_ornament, hair_between_eyes, very_long_hair, bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 226 | 337.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/godguard_brodia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 226 | 195.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/godguard_brodia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 544 | 403.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/godguard_brodia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 226 | 297.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/godguard_brodia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 544 | 565.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/godguard_brodia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
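If you only need the images and their tag texts (without the waifuc metadata of the raw package), the IMG+TXT packages above can be used directly. The snippet below is a minimal sketch, assuming each image inside the archive sits next to a same-named `.txt` file holding its tags; the exact layout inside the zip is not documented here, so treat the pairing logic as an assumption.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (images plus plain-text tag files)
zip_file = hf_hub_download(
    repo_id='CyberHarem/godguard_brodia_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with a same-named .txt tag file (assumed layout)
for root, _, files in os.walk(dataset_dir):
    for name in sorted(files):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        txt_path = os.path.join(root, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```

The same pattern applies to the 1200px and stage3 cropped packages; only the `filename` changes.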
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/godguard_brodia_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, bare_shoulders, gauntlets, solo, thighhighs, cleavage, boots, sword, looking_at_viewer, thighs, armor, gloves, white_background, white_skirt |
| 1 | 6 |  |  |  |  |  | 1girl, armored_boots, bare_shoulders, gauntlets, looking_at_viewer, pleated_skirt, solo, medium_breasts, thighhighs, white_background, belt, full_body, standing, sword, zettai_ryouiki, holding, simple_background |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, white_dress, white_gloves, closed_mouth, elbow_gloves, hair_flower, smile, holding_sword, medium_breasts, blush, collarbone, full_body, high_heels, petals, shiny_hair, simple_background, sleeveless_dress, standing, thighs, white_background, white_footwear |
| 3 | 14 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, solo, blush, feather_hair_ornament, thighs, white_bikini, navel, layered_bikini, white_skirt, closed_mouth, collarbone, smile, highleg_bikini, miniskirt, black_bikini, blue_sky, day, wrist_scrunchie |
| 4 | 6 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, playboy_bunny, rabbit_ears, solo, detached_collar, blush, cleavage, fake_animal_ears, highleg_leotard, wrist_cuffs, black_pantyhose, open_mouth, simple_background, thighhighs, thighs, white_leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | gauntlets | solo | thighhighs | cleavage | boots | sword | looking_at_viewer | thighs | armor | gloves | white_background | white_skirt | armored_boots | pleated_skirt | medium_breasts | belt | full_body | standing | zettai_ryouiki | holding | simple_background | white_dress | white_gloves | closed_mouth | elbow_gloves | hair_flower | smile | holding_sword | blush | collarbone | high_heels | petals | shiny_hair | sleeveless_dress | white_footwear | feather_hair_ornament | white_bikini | navel | layered_bikini | highleg_bikini | miniskirt | black_bikini | blue_sky | day | wrist_scrunchie | playboy_bunny | rabbit_ears | detached_collar | fake_animal_ears | highleg_leotard | wrist_cuffs | black_pantyhose | open_mouth | white_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:------------|:-------|:-------------|:-----------|:--------|:--------|:--------------------|:---------|:--------|:---------|:-------------------|:--------------|:----------------|:----------------|:-----------------|:-------|:------------|:-----------|:-----------------|:----------|:--------------------|:--------------|:---------------|:---------------|:---------------|:--------------|:--------|:----------------|:--------|:-------------|:-------------|:---------|:-------------|:-------------------|:-----------------|:------------------------|:---------------|:--------|:-----------------|:-----------------|:------------|:---------------|:-----------|:------|:------------------|:----------------|:--------------|:------------------|:-------------------|:------------------|:--------------|:------------------|:-------------|:----------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | X | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | | | | | X | X | | | X | | | | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | | X | | X | | | X | X | | | | X | | | | | | | | | | | | X | | | X | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | X | X | | | X | X | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/godguard_brodia_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:27:40+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T21:14:18+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of godguard\_brodia (Granblue Fantasy)
==============================================
This is the dataset of godguard\_brodia (Granblue Fantasy), containing 226 images and their tags.
The core tags of this character are 'red\_hair, long\_hair, breasts, blue\_eyes, hair\_ornament, hair\_between\_eyes, very\_long\_hair, bangs, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
f8f7d9b8f9d659da7a6a10bcf4a8d23731e74fbe |
# Dataset of catherine (Granblue Fantasy)
This is the dataset of catherine (Granblue Fantasy), containing 35 images and their tags.
The core tags of this character are `animal_ears, long_hair, pink_hair, breasts, pink_eyes, hat, large_breasts, mini_hat, top_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 35 | 35.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 35 | 24.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 67 | 45.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 35 | 34.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 67 | 59.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/catherine_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
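The tag clusters listed in the next section can double as simple filters on top of this raw loading. The sketch below assumes `item.meta['tags']` behaves like a container of tag names (this card does not document its exact type), and keeps only the images carrying one of the cluster tags:

```python
from waifuc.source import LocalSource

# keep only the items whose tags include one of the cluster tags below
source = LocalSource('dataset_dir')
selected = [item for item in source if 'mini_top_hat' in item.meta['tags']]
print(len(selected), "images tagged 'mini_top_hat'")
```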
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, erune, glasses, cleavage, garter_straps, smile, tray, bare_shoulders, solo, alternate_costume, black_thighhighs, cup, holding, looking_at_viewer, ponytail, red_eyes, simple_background, skirt, teapot, green_apron, hand_on_hip, medium_breasts, very_long_hair |
| 1 | 8 |  |  |  |  |  | 1girl, erune, solo, looking_at_viewer, smile, thighhighs, handgun, black_gloves, mini_top_hat, hairband, holding_gun, holster, blush, full_body, leotard, red_eyes, simple_background, sitting |
| 2 | 6 |  |  |  |  |  | 1girl, erune, looking_at_viewer, simple_background, smile, solo, white_background, bangs, black_gloves, cleavage, elbow_gloves, hairband, leotard, mini_top_hat, parted_lips |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | glasses | cleavage | garter_straps | smile | tray | bare_shoulders | solo | alternate_costume | black_thighhighs | cup | holding | looking_at_viewer | ponytail | red_eyes | simple_background | skirt | teapot | green_apron | hand_on_hip | medium_breasts | very_long_hair | thighhighs | handgun | black_gloves | mini_top_hat | hairband | holding_gun | holster | blush | full_body | leotard | sitting | white_background | bangs | elbow_gloves | parted_lips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------|:-----------|:----------------|:--------|:-------|:-----------------|:-------|:--------------------|:-------------------|:------|:----------|:--------------------|:-----------|:-----------|:--------------------|:--------|:---------|:--------------|:--------------|:-----------------|:-----------------|:-------------|:----------|:---------------|:---------------|:-----------|:--------------|:----------|:--------|:------------|:----------|:----------|:-------------------|:--------|:---------------|:--------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | | | X | | | X | | | | | X | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | X | | X | | | X | | | | | X | | | X | | | | | | | | | X | X | X | | | | | X | | X | X | X | X |
| CyberHarem/catherine_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:27:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:33:42+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of catherine (Granblue Fantasy)
=======================================
This is the dataset of catherine (Granblue Fantasy), containing 35 images and their tags.
The core tags of this character are 'animal\_ears, long\_hair, pink\_hair, breasts, pink\_eyes, hat, large\_breasts, mini\_hat, top\_hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7cdf9061e6e60849d0a10efe9d117282b0713501 |
# Dataset of tikoh (Granblue Fantasy)
This is the dataset of tikoh (Granblue Fantasy), containing 75 images and their tags.
The core tags of this character are `animal_ears, bangs, breasts, long_hair, hair_ornament, blue_hair, purple_eyes, medium_breasts, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 75 | 122.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tikoh_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 75 | 66.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tikoh_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 192 | 147.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tikoh_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 75 | 108.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tikoh_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 192 | 208.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tikoh_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tikoh_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, erune, holding_syringe, looking_at_viewer, nurse_cap, open_mouth, smile, solo, white_dress, white_gloves, bandages, belt, eyepatch, hair_over_one_eye, one_eye_covered, purple_hair, short_dress, x_hair_ornament, bare_shoulders, large_breasts, blood, high_heels, short_sleeves, shoulder_bag, simple_background, sitting, sleeveless, thick_thighs, thigh_strap, white_background, white_headwear |
| 1 | 15 |  |  |  |  |  | 1girl, erune, solo, looking_at_viewer, white_gloves, cleavage, thighs, black_thighhighs, white_dress, white_background, white_headwear, holding_staff, short_dress, bare_shoulders, blush, boots, simple_background |
| 2 | 5 |  |  |  |  |  | 1girl, erune, looking_at_viewer, solo, white_bikini, blush, hairclip, navel, thigh_strap, blunt_bangs, bracelet, collarbone, open_mouth, simple_background, sitting, thighs, white_background, yellow_eyes |
| 3 | 5 |  |  |  |  |  | 1girl, bikini, erune, looking_at_viewer, navel, solo, bare_shoulders, blue_sky, blush, cloud, collarbone, day, ocean, outdoors, water, closed_mouth, sun_hat, thigh_strap, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | holding_syringe | looking_at_viewer | nurse_cap | open_mouth | smile | solo | white_dress | white_gloves | bandages | belt | eyepatch | hair_over_one_eye | one_eye_covered | purple_hair | short_dress | x_hair_ornament | bare_shoulders | large_breasts | blood | high_heels | short_sleeves | shoulder_bag | simple_background | sitting | sleeveless | thick_thighs | thigh_strap | white_background | white_headwear | cleavage | thighs | black_thighhighs | holding_staff | blush | boots | white_bikini | hairclip | navel | blunt_bangs | bracelet | collarbone | yellow_eyes | bikini | blue_sky | cloud | day | ocean | outdoors | water | closed_mouth | sun_hat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:------------------|:--------------------|:------------|:-------------|:--------|:-------|:--------------|:---------------|:-----------|:-------|:-----------|:--------------------|:------------------|:--------------|:--------------|:------------------|:-----------------|:----------------|:--------|:-------------|:----------------|:---------------|:--------------------|:----------|:-------------|:---------------|:--------------|:-------------------|:-----------------|:-----------|:---------|:-------------------|:----------------|:--------|:--------|:---------------|:-----------|:--------|:--------------|:-----------|:-------------|:--------------|:---------|:-----------|:--------|:------|:--------|:-----------|:--------|:---------------|:----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | | X | | | | X | X | X | | | | | | | X | | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | | X | | X | | | | | | | | | | | | | | | | | X | X | | | X | X | | | X | | | X | | X | X | X | X | X | X | X | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | | | X | | | | | | | | | | | X | | | | | | | | | | X | | | | X | | | X | | | | X | | | X | | X | X | X | X | X | X | X | X | X |
| CyberHarem/tikoh_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:35:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:54:29+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of tikoh (Granblue Fantasy)
===================================
This is the dataset of tikoh (Granblue Fantasy), containing 75 images and their tags.
The core tags of this character are 'animal\_ears, bangs, breasts, long\_hair, hair\_ornament, blue\_hair, purple\_eyes, medium\_breasts, hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
03e245667d18715d9ceb1e85005ef2c575e30747 |
# Dataset of lich (Granblue Fantasy)
This is the dataset of lich (Granblue Fantasy), containing 51 images and their tags.
The core tags of this character are `purple_eyes, black_hair, bags_under_eyes, bangs, breasts, hair_between_eyes, small_breasts, hair_over_one_eye`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 51 | 74.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lich_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 51 | 44.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lich_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 112 | 88.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lich_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 51 | 66.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lich_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 112 | 123.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lich_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lich_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, hood_up, jewelry, looking_at_viewer, sharp_teeth, solo, white_background, grin, simple_background, upper_body, bridal_gauntlets, crown, ghost, medium_hair, sharp_fingernails |
| 1 | 6 |  |  |  |  |  | 1girl, bracelet, bridal_gauntlets, looking_at_viewer, sharp_fingernails, sharp_teeth, solo, long_fingernails, purple_nails, smile, dress, simple_background, white_background, glowing, hood, medium_hair, open_mouth |
| 2 | 9 |  |  |  |  |  | 1girl, long_sleeves, sleeves_past_fingers, solo, dress, hood_up, looking_at_viewer, ghost, hair_ornament, hooded_robe, smile, white_robe, sharp_teeth, halloween, open_mouth, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hood_up | jewelry | looking_at_viewer | sharp_teeth | solo | white_background | grin | simple_background | upper_body | bridal_gauntlets | crown | ghost | medium_hair | sharp_fingernails | bracelet | long_fingernails | purple_nails | smile | dress | glowing | hood | open_mouth | long_sleeves | sleeves_past_fingers | hair_ornament | hooded_robe | white_robe | halloween |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:----------|:--------------------|:--------------|:-------|:-------------------|:-------|:--------------------|:-------------|:-------------------|:--------|:--------|:--------------|:--------------------|:-----------|:-------------------|:---------------|:--------|:--------|:----------|:-------|:-------------|:---------------|:-----------------------|:----------------|:--------------|:-------------|:------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | X | X | X | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | X | X | | | | X | | | X | | | | | | X | X | | | X | X | X | X | X | X | X |
| CyberHarem/lich_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:35:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:45:09+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of lich (Granblue Fantasy)
==================================
This is the dataset of lich (Granblue Fantasy), containing 51 images and their tags.
The core tags of this character are 'purple\_eyes, black\_hair, bags\_under\_eyes, bangs, breasts, hair\_between\_eyes, small\_breasts, hair\_over\_one\_eye', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
f1f6a4b8e37eb55ba825f6a43b35216e7924cb49 |
# Dataset of haaselia (Granblue Fantasy)
This is the dataset of haaselia (Granblue Fantasy), containing 51 images and their tags.
The core tags of this character are `brown_hair, pointy_ears, long_hair, earrings, braid, brown_eyes, single_braid, bangs, hoop_earrings, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 51 | 62.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haaselia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 51 | 33.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haaselia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 112 | 71.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haaselia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 51 | 54.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haaselia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 112 | 105.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haaselia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/haaselia_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 51 |  |  |  |  |  | 1girl, jewelry, solo, harvin, looking_at_viewer, smile, long_sleeves, blush, dress, closed_mouth, white_background, simple_background, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jewelry | solo | harvin | looking_at_viewer | smile | long_sleeves | blush | dress | closed_mouth | white_background | simple_background | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------|:---------|:--------------------|:--------|:---------------|:--------|:--------|:---------------|:-------------------|:--------------------|:---------------|
| 0 | 51 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/haaselia_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:35:41+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:45:52+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of haaselia (Granblue Fantasy)
======================================
This is the dataset of haaselia (Granblue Fantasy), containing 51 images and their tags.
The core tags of this character are 'brown\_hair, pointy\_ears, long\_hair, earrings, braid, brown\_eyes, single\_braid, bangs, hoop\_earrings, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
098b883cde21b2e43e9c54b45a8e078750c15e63 |
# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca](https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca",
"harness_winogrande_5",
split="train")
```
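You can also enumerate all available configurations and load the aggregated "results" configuration. The sketch below assumes only what this card states (a "results" configuration and a "train" split pointing at the latest run); the exact schema of that configuration is not documented here.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca"

# enumerate the per-task configurations stored in this details repository
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available, e.g.:", configs[:5])

# the aggregated "results" configuration; the "train" split points at the latest run
results = load_dataset(repo, "results", split="train")
print(results)
```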
## Latest results
These are the [latest results from run 2024-01-21T20:50:11.424256](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca/blob/main/results_2024-01-21T20-50-11.424256.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5896401861614491,
"acc_stderr": 0.03346170298483195,
"acc_norm": 0.5943166762731281,
"acc_norm_stderr": 0.03415179201309978,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5586443323081062,
"mc2_stderr": 0.015323579243350587
},
"harness|arc:challenge|25": {
"acc": 0.5221843003412969,
"acc_stderr": 0.014597001927076135,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182524
},
"harness|hellaswag|10": {
"acc": 0.6096395140410277,
"acc_stderr": 0.00486834105656622,
"acc_norm": 0.8082055367456682,
"acc_norm_stderr": 0.00392907627647338
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562417,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153317,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153317
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.03201650100739611,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.03201650100739611
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.02845882099146031,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.02845882099146031
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823302,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895806,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570772,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570772
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615697,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615697
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004903,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729146,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.01261820406658839,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.01261820406658839
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.019821843688271754,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.019821843688271754
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014635,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014635
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5586443323081062,
"mc2_stderr": 0.015323579243350587
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.01180736022402539
},
"harness|gsm8k|5": {
"acc": 0.37604245640636846,
"acc_stderr": 0.013342532064849772
}
}
```
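For a quick local aggregation, the results file linked above can also be downloaded directly. This is a minimal sketch, not the leaderboard's own aggregation: it assumes the file keeps the same key layout as the excerpt above (some results files nest the metrics under a "results" key, so the code falls back to that) and simply averages the `acc` of the `hendrycksTest` (MMLU) subtasks.

```python
import json

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca",
    repo_type="dataset",
    filename="results_2024-01-21T20-50-11.424256.json",
)

with open(path, "r", encoding="utf-8") as f:
    data = json.load(f)

# the excerpt above shows the metrics at the top level; some results files nest
# them under a "results" key instead, so fall back to that when present
metrics = data.get("results", data)

# average accuracy over the hendrycksTest (MMLU) subtasks
mmlu = [v["acc"] for k, v in metrics.items()
        if k.startswith("harness|hendrycksTest-") and "acc" in v]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```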
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-21T20:47:01+00:00 | {"pretty_name": "Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca](https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T20:50:11.424256](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca/blob/main/results_2024-01-21T20-50-11.424256.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5896401861614491,\n \"acc_stderr\": 0.03346170298483195,\n \"acc_norm\": 0.5943166762731281,\n \"acc_norm_stderr\": 0.03415179201309978,\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5586443323081062,\n \"mc2_stderr\": 0.015323579243350587\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5221843003412969,\n \"acc_stderr\": 0.014597001927076135,\n \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182524\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6096395140410277,\n \"acc_stderr\": 0.00486834105656622,\n \"acc_norm\": 0.8082055367456682,\n \"acc_norm_stderr\": 0.00392907627647338\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n 
\"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153317,\n 
\"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153317\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.03201650100739611,\n \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.03201650100739611\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.02845882099146031,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.02845882099146031\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 
0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n \"acc_stderr\": 0.014743125394823302,\n \"acc_norm\": 0.7828863346104725,\n \"acc_norm_stderr\": 0.014743125394823302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895806,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895806\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.016251139711570772,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.016251139711570772\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615697,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615697\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n \"acc_stderr\": 0.026730620728004903,\n \"acc_norm\": 0.6688102893890675,\n \"acc_norm_stderr\": 0.026730620728004903\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729146,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.01261820406658839,\n \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.01261820406658839\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016636,\n \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016636\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5996732026143791,\n \"acc_stderr\": 0.019821843688271754,\n \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.019821843688271754\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.030360490154014635,\n \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.030360490154014635\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5586443323081062,\n \"mc2_stderr\": 0.015323579243350587\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.01180736022402539\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.37604245640636846,\n \"acc_stderr\": 0.013342532064849772\n }\n}\n```", "repo_url": "https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-44-48.007674.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-44-48.007674.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-50-11.424256.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-50-11.424256.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-50-11.424256.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-50-11.424256.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": 
"2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-50-11.424256.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["**/details_harness|winogrande|5_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["**/details_harness|winogrande|5_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T20-50-11.424256.parquet"]}]}, 
{"config_name": "results", "data_files": [{"split": "2024_01_21T20_44_48.007674", "path": ["results_2024-01-21T20-44-48.007674.parquet"]}, {"split": "2024_01_21T20_50_11.424256", "path": ["results_2024-01-21T20-50-11.424256.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T20-50-11.424256.parquet"]}]}]} | 2024-01-21T20:52:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca
Dataset automatically created during the evaluation run of model robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
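The snippet below is a minimal sketch using the `datasets` library. The repository id is inferred from the naming convention used by the other evaluation-run datasets in this collection (`open-llm-leaderboard/details_<org>__<model>`), so verify it against the actual repository before relying on it; the config name comes from this card's configuration metadata.

```python
from datasets import load_dataset

# Load the per-sample details for one task/run; config names follow the
# "harness_<task>_<n_shot>" pattern listed in this card's configuration metadata.
data = load_dataset(
    "open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca",  # inferred repo id
    "harness_winogrande_5",
    split="train",
)
```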
## Latest results
These are the latest results from run 2024-01-21T20:50:11.424256 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
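The aggregated numbers themselves are not reproduced in this rendering; they can be pulled from the "results" configuration, which this card's configuration metadata lists with a "latest" split. A minimal sketch, with the repository id inferred as above:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; the "latest" split
# points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpaca",  # inferred repo id
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```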
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca\n\n\n\nDataset automatically created during the evaluation run of model robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T20:50:11.424256(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca\n\n\n\nDataset automatically created during the evaluation run of model robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T20:50:11.424256(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
ecf63b7c9631452289cdc103ddf35206acbda40d |
# Dataset of shatola (Granblue Fantasy)
This is the dataset of shatola (Granblue Fantasy), containing 367 images and their tags.
The core tags of this character are `long_hair, animal_ears, blue_hair, breasts, horns, cow_ears, bangs, cow_horns, cow_girl, large_breasts, pointy_ears, ear_piercing`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 367 | 580.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 367 | 308.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 951 | 691.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 367 | 502.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 951 | 1.00 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
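If you only need one of the packaged variants from the table above (rather than the raw package used in the next section), it can be fetched the same way. A minimal sketch, with the file name taken from the download links above and the output directory chosen arbitrarily:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Fetch the 800px IMG+TXT package (file name taken from the download link above).
zip_file = hf_hub_download(
    repo_id='CyberHarem/shatola_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract the image/.txt pairs to a local directory (name is arbitrary).
output_dir = 'shatola_800'
os.makedirs(output_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(output_dir)
```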
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shatola_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, draph, looking_at_viewer, red_dress, solo, blush, cleavage, smile, bell, bare_shoulders, open_mouth, bow, fur_collar, twintails, yellow_eyes, cow_print |
| 1 | 21 |  |  |  |  |  | 1girl, cleavage, cow_print, draph, looking_at_viewer, piercing, solo, bare_shoulders, blush, detached_sleeves, white_bikini, detached_collar, see-through, wide_sleeves, open_mouth, purple_eyes |
| 2 | 8 |  |  |  |  |  | 1girl, blush, cleavage, cow_print, detached_sleeves, draph, looking_at_viewer, piercing, solo, thighs, white_bikini, white_thighhighs, bare_shoulders, detached_collar, navel, see-through, short_shorts, sitting, white_shorts, wide_sleeves, purple_eyes, open_mouth, white_background, simple_background |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, cow_print, cow_tail, detached_collar, detached_sleeves, draph, looking_at_viewer, navel, piercing, short_shorts, solo, thighs, white_bikini, white_shorts, white_thighhighs, wide_sleeves, blush, micro_shorts, open_mouth |
| 4 | 9 |  |  |  |  |  | 1boy, 1girl, blush, cow_print, draph, hetero, nipples, paizuri, solo_focus, penis, piercing, earrings, looking_at_viewer, open_mouth, collarbone, detached_collar, bar_censor, detached_sleeves, huge_breasts, mosaic_censoring, purple_eyes, smile |
| 5 | 12 |  |  |  |  |  | 1girl, blush, cow_print, hetero, penis, sex, vaginal, 1boy, draph, navel, open_mouth, solo_focus, nipples, thighhighs, piercing, cum_in_pussy, girl_on_top, nude, bare_shoulders, cowgirl_position, detached_sleeves, looking_at_viewer, mosaic_censoring, bar_censor, smile, thighs |
| 6 | 5 |  |  |  |  |  | blush, draph, looking_at_viewer, onsen, 1girl, collarbone, night_sky, solo, towel_on_head, wet, naked_towel, open_mouth, sitting, smile, star_(sky), bare_shoulders, bathing, cleavage, completely_nude, huge_breasts, navel, nude_cover, steam_censor |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | looking_at_viewer | red_dress | solo | blush | cleavage | smile | bell | bare_shoulders | open_mouth | bow | fur_collar | twintails | yellow_eyes | cow_print | piercing | detached_sleeves | white_bikini | detached_collar | see-through | wide_sleeves | purple_eyes | thighs | white_thighhighs | navel | short_shorts | sitting | white_shorts | white_background | simple_background | cow_tail | micro_shorts | 1boy | hetero | nipples | paizuri | solo_focus | penis | earrings | collarbone | bar_censor | huge_breasts | mosaic_censoring | sex | vaginal | thighhighs | cum_in_pussy | girl_on_top | nude | cowgirl_position | onsen | night_sky | towel_on_head | wet | naked_towel | star_(sky) | bathing | completely_nude | nude_cover | steam_censor |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:------------|:-------|:--------|:-----------|:--------|:-------|:-----------------|:-------------|:------|:-------------|:------------|:--------------|:------------|:-----------|:-------------------|:---------------|:------------------|:--------------|:---------------|:--------------|:---------|:-------------------|:--------|:---------------|:----------|:---------------|:-------------------|:--------------------|:-----------|:---------------|:-------|:---------|:----------|:----------|:-------------|:--------|:-----------|:-------------|:-------------|:---------------|:-------------------|:------|:----------|:-------------|:---------------|:--------------|:-------|:-------------------|:--------|:------------|:----------------|:------|:--------------|:-------------|:----------|:------------------|:-------------|:---------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 21 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | | | | | X | X | X | X | X | | X | | X | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | | | X | | X | | | X | | | | | X | X | X | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | X | X | | | X | | X | | X | X | | | | | X | X | X | | | | | | X | | X | | | | | | | | X | X | X | | X | X | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/shatola_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:48:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T22:19:59+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of shatola (Granblue Fantasy)
=====================================
This is the dataset of shatola (Granblue Fantasy), containing 367 images and their tags.
The core tags of this character are 'long\_hair, animal\_ears, blue\_hair, breasts, horns, cow\_ears, bangs, cow\_horns, cow\_girl, large\_breasts, pointy\_ears, ear\_piercing', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d82cb54aeb7e9791db264aa970c2ca9994295561 |
# Dataset of claudia (Granblue Fantasy)
This is the dataset of claudia (Granblue Fantasy), containing 19 images and their tags.
The core tags of this character are `maid_headdress, black_hair, single_hair_bun, bangs, hair_bun, short_hair, blunt_bangs, braid, blue_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 18.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/claudia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 12.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/claudia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 37 | 19.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/claudia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 16.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/claudia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 37 | 24.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/claudia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/claudia_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, dress, gauntlets, puffy_sleeves, frills, long_sleeves, maid_apron, backless_outfit, looking_back, simple_background, holding, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | dress | gauntlets | puffy_sleeves | frills | long_sleeves | maid_apron | backless_outfit | looking_back | simple_background | holding | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:------------|:----------------|:---------|:---------------|:-------------|:------------------|:---------------|:--------------------|:----------|:-------------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/claudia_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T20:48:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T20:53:44+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of claudia (Granblue Fantasy)
=====================================
This is the dataset of claudia (Granblue Fantasy), containing 19 images and their tags.
The core tags of this character are 'maid\_headdress, black\_hair, single\_hair\_bun, bangs, hair\_bun, short\_hair, blunt\_bangs, braid, blue\_eyes, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
23a617758d91b5b7a933280ed9a741c0bc7649e6 |
# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1](https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T20:50:04.239190](https://huggingface.co/datasets/open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.1/blob/main/results_2024-01-21T20-50-04.239190.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6547917128807956,
"acc_stderr": 0.03202333509712328,
"acc_norm": 0.6531759639050361,
"acc_norm_stderr": 0.03271497660010414,
"mc1": 0.616891064871481,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.745668282129291,
"mc2_stderr": 0.014357887130424447
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7448805460750854,
"acc_norm_stderr": 0.012739038695202091
},
"harness|hellaswag|10": {
"acc": 0.7266480780720972,
"acc_stderr": 0.004447691405592719,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.0031142850772280287
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337124,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337124
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4592178770949721,
"acc_stderr": 0.016666783616525776,
"acc_norm": 0.4592178770949721,
"acc_norm_stderr": 0.016666783616525776
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598566,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598566
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.616891064871481,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.745668282129291,
"mc2_stderr": 0.014357887130424447
},
"harness|winogrande|5": {
"acc": 0.8831886345698501,
"acc_stderr": 0.009027186879167794
},
"harness|gsm8k|5": {
"acc": 0.7164518574677786,
"acc_stderr": 0.012415070917508125
}
}
```
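These aggregated metrics can also be retrieved programmatically. A minimal sketch, assuming this dataset follows the same "results" configuration and "latest" split layout described above and used by the other evaluation-run datasets in this collection:

```python
from datasets import load_dataset

# The "results" configuration aggregates all task metrics; the "latest" split
# tracks the most recent run (here 2024-01-21T20:50:04.239190).
results = load_dataset(
    "open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.1",
    "results",
    split="latest",
)
print(results.column_names[:10])  # inspect which metric columns are available
```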
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.1 | [
"region:us"
] | 2024-01-21T20:49:38+00:00 | {"pretty_name": "Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1](https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T20:50:04.239190](https://huggingface.co/datasets/open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.1/blob/main/results_2024-01-21T20-50-04.239190.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6547917128807956,\n \"acc_stderr\": 0.03202333509712328,\n \"acc_norm\": 0.6531759639050361,\n \"acc_norm_stderr\": 0.03271497660010414,\n \"mc1\": 0.616891064871481,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.745668282129291,\n \"mc2_stderr\": 0.014357887130424447\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n \"acc_norm\": 0.7448805460750854,\n \"acc_norm_stderr\": 0.012739038695202091\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7266480780720972,\n \"acc_stderr\": 0.004447691405592719,\n \"acc_norm\": 0.8906592312288388,\n \"acc_norm_stderr\": 0.0031142850772280287\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n 
\"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4592178770949721,\n \"acc_stderr\": 0.016666783616525776,\n \"acc_norm\": 0.4592178770949721,\n \"acc_norm_stderr\": 0.016666783616525776\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.012729785386598566,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.012729785386598566\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.616891064871481,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.745668282129291,\n \"mc2_stderr\": 0.014357887130424447\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8831886345698501,\n \"acc_stderr\": 0.009027186879167794\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7164518574677786,\n \"acc_stderr\": 0.012415070917508125\n }\n}\n```", "repo_url": "https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-47-16.127941.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-47-16.127941.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-50-04.239190.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-50-04.239190.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-50-04.239190.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-50-04.239190.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-50-04.239190.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": 
"2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-47-16.127941.parquet"]}, 
{"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["**/details_harness|winogrande|5_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": ["**/details_harness|winogrande|5_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T20-50-04.239190.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T20_47_16.127941", "path": ["results_2024-01-21T20-47-16.127941.parquet"]}, {"split": "2024_01_21T20_50_04.239190", "path": 
["results_2024-01-21T20-50-04.239190.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T20-50-04.239190.parquet"]}]}]} | 2024-01-21T20:52:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1
Dataset automatically created during the evaluation run of model RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
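A minimal loading sketch (hedged: the details-repository name below is assumed from the usual `open-llm-leaderboard/details_<org>__<model>` naming convention, while `harness_winogrande_5` and the `latest` split are taken from the config metadata in this card):

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's details-repo naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.1",
    "harness_winogrande_5",  # one of the 63 evaluation configs listed in this card's metadata
    split="latest",          # each config also defines timestamped splits, one per run
)
```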
## Latest results
These are the latest results from run 2024-01-21T20:50:04.239190 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T20:50:04.239190 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T20:50:04.239190 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
afd7952f72e05df8c2dc949e7c05991703f03e28 |
This dataset was generated by reformatting [`coref-data/corefud_raw`](https://huggingface.co/datasets/coref-data/corefud_raw) into the indiscrim coreference format. See that repo for dataset details.
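For a quick look at the data, a single configuration can be loaded with the `datasets` library (a minimal sketch; `en_gum-corefud` is one of the configuration names listed in the repository metadata, and the fields follow the feature schema given there):

```python
from datasets import load_dataset

# Load one source corpus (configuration) of the indiscrim-formatted CorefUD data.
gum = load_dataset("coref-data/corefud_indiscrim", "en_gum-corefud")

example = gum["train"][0]
# Each example carries the raw text, per-sentence token annotations,
# and coreference chains encoded as nested lists of integer spans.
print(example.keys())
print(example["coref_chains"][0])
```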
See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
| coref-data/corefud_indiscrim | [
"region:us"
] | 2024-01-21T20:50:54+00:00 | {"dataset_info": [{"config_name": "ca_ancora-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 38341803, "num_examples": 1011}, {"name": "validation", "num_bytes": 5660530, "num_examples": 131}], "download_size": 7906331, "dataset_size": 44002333}, {"config_name": "cs_pcedt-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 149583151, "num_examples": 1875}, {"name": "validation", "num_bytes": 26160516, "num_examples": 337}], "download_size": 31260936, "dataset_size": 175743667}, {"config_name": "cs_pdt-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 109542424, "num_examples": 2533}, {"name": "validation", "num_bytes": 14886840, "num_examples": 316}], "download_size": 23982751, "dataset_size": 124429264}, {"config_name": "de_parcorfull-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", 
"dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 1035732, "num_examples": 15}, {"name": "validation", "num_bytes": 132412, "num_examples": 2}], "download_size": 273217, "dataset_size": 1168144}, {"config_name": "de_potsdamcc-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 3999054, "num_examples": 142}, {"name": "validation", "num_bytes": 511557, "num_examples": 17}], "download_size": 859121, "dataset_size": 4510611}, {"config_name": "en_gum-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 17919310, "num_examples": 151}, {"name": "validation", "num_bytes": 2369056, "num_examples": 22}], "download_size": 4234788, "dataset_size": 20288366}, {"config_name": "en_parcorfull-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 899917, "num_examples": 15}, {"name": "validation", "num_bytes": 115587, "num_examples": 2}], "download_size": 259976, 
"dataset_size": 1015504}, {"config_name": "es_ancora-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 43242148, "num_examples": 1080}, {"name": "validation", "num_bytes": 5404400, "num_examples": 131}], "download_size": 8758107, "dataset_size": 48646548}, {"config_name": "fr_democrat-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "null"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 23704875, "num_examples": 50}, {"name": "validation", "num_bytes": 2914195, "num_examples": 46}], "download_size": 5011046, "dataset_size": 26619070}, {"config_name": "hu_korkor-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 2358029, "num_examples": 76}, {"name": "validation", "num_bytes": 305829, "num_examples": 9}], "download_size": 644899, "dataset_size": 2663858}, {"config_name": "hu_szegedkoref-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", 
"dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 11618556, "num_examples": 320}, {"name": "validation", "num_bytes": 1365657, "num_examples": 40}], "download_size": 2509790, "dataset_size": 12984213}, {"config_name": "lt_lcc-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 3908009, "num_examples": 80}, {"name": "validation", "num_bytes": 435994, "num_examples": 10}], "download_size": 802890, "dataset_size": 4344003}, {"config_name": "no_bokmaalnarc-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "null"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 21847333, "num_examples": 284}, {"name": "validation", "num_bytes": 2319889, "num_examples": 31}], "download_size": 4979662, "dataset_size": 24167222}, {"config_name": "no_nynorsknarc-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "null"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 18472313, "num_examples": 336}, {"name": "validation", "num_bytes": 1904614, "num_examples": 28}], "download_size": 4209149, "dataset_size": 20376927}, {"config_name": 
"pl_pcc-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 68325348, "num_examples": 1463}, {"name": "validation", "num_bytes": 8583039, "num_examples": 183}], "download_size": 14971275, "dataset_size": 76908387}, {"config_name": "ru_rucor-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "null"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 15595222, "num_examples": 145}, {"name": "validation", "num_bytes": 2685627, "num_examples": 18}], "download_size": 3651673, "dataset_size": 18280849}, {"config_name": "tr_itcc-corefud", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "null"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 5399055, "num_examples": 19}, {"name": "validation", "num_bytes": 599026, "num_examples": 2}], "download_size": 1158897, "dataset_size": 5998081}], "configs": [{"config_name": "ca_ancora-corefud", "data_files": [{"split": "train", "path": "ca_ancora-corefud/train-*"}, {"split": "validation", "path": "ca_ancora-corefud/validation-*"}]}, {"config_name": "cs_pcedt-corefud", "data_files": [{"split": "train", "path": "cs_pcedt-corefud/train-*"}, {"split": "validation", "path": "cs_pcedt-corefud/validation-*"}]}, {"config_name": "cs_pdt-corefud", "data_files": [{"split": "train", "path": "cs_pdt-corefud/train-*"}, {"split": "validation", "path": "cs_pdt-corefud/validation-*"}]}, {"config_name": 
"de_parcorfull-corefud", "data_files": [{"split": "train", "path": "de_parcorfull-corefud/train-*"}, {"split": "validation", "path": "de_parcorfull-corefud/validation-*"}]}, {"config_name": "de_potsdamcc-corefud", "data_files": [{"split": "train", "path": "de_potsdamcc-corefud/train-*"}, {"split": "validation", "path": "de_potsdamcc-corefud/validation-*"}]}, {"config_name": "en_gum-corefud", "data_files": [{"split": "train", "path": "en_gum-corefud/train-*"}, {"split": "validation", "path": "en_gum-corefud/validation-*"}]}, {"config_name": "en_parcorfull-corefud", "data_files": [{"split": "train", "path": "en_parcorfull-corefud/train-*"}, {"split": "validation", "path": "en_parcorfull-corefud/validation-*"}]}, {"config_name": "es_ancora-corefud", "data_files": [{"split": "train", "path": "es_ancora-corefud/train-*"}, {"split": "validation", "path": "es_ancora-corefud/validation-*"}]}, {"config_name": "fr_democrat-corefud", "data_files": [{"split": "train", "path": "fr_democrat-corefud/train-*"}, {"split": "validation", "path": "fr_democrat-corefud/validation-*"}]}, {"config_name": "hu_korkor-corefud", "data_files": [{"split": "train", "path": "hu_korkor-corefud/train-*"}, {"split": "validation", "path": "hu_korkor-corefud/validation-*"}]}, {"config_name": "hu_szegedkoref-corefud", "data_files": [{"split": "train", "path": "hu_szegedkoref-corefud/train-*"}, {"split": "validation", "path": "hu_szegedkoref-corefud/validation-*"}]}, {"config_name": "lt_lcc-corefud", "data_files": [{"split": "train", "path": "lt_lcc-corefud/train-*"}, {"split": "validation", "path": "lt_lcc-corefud/validation-*"}]}, {"config_name": "no_bokmaalnarc-corefud", "data_files": [{"split": "train", "path": "no_bokmaalnarc-corefud/train-*"}, {"split": "validation", "path": "no_bokmaalnarc-corefud/validation-*"}]}, {"config_name": "no_nynorsknarc-corefud", "data_files": [{"split": "train", "path": "no_nynorsknarc-corefud/train-*"}, {"split": "validation", "path": "no_nynorsknarc-corefud/validation-*"}]}, {"config_name": "pl_pcc-corefud", "data_files": [{"split": "train", "path": "pl_pcc-corefud/train-*"}, {"split": "validation", "path": "pl_pcc-corefud/validation-*"}]}, {"config_name": "ru_rucor-corefud", "data_files": [{"split": "train", "path": "ru_rucor-corefud/train-*"}, {"split": "validation", "path": "ru_rucor-corefud/validation-*"}]}, {"config_name": "tr_itcc-corefud", "data_files": [{"split": "train", "path": "tr_itcc-corefud/train-*"}, {"split": "validation", "path": "tr_itcc-corefud/validation-*"}]}]} | 2024-01-21T20:53:04+00:00 | [] | [] | TAGS
#region-us
|
This dataset was generated by reformatting 'coref-data/corefud_raw' into the indiscrim coreference format. See that repo for dataset details.
See ianporada/coref-data for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
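For reference, a minimal sketch of loading one configuration and inspecting the indiscrim-format fields (the repo id below is a hypothetical placeholder; the config and feature names are taken from this dataset's metadata):
```python
from datasets import load_dataset

# Hypothetical placeholder -- substitute this dataset's actual repo id.
REPO_ID = "user/corefud-indiscrim"

# Config names follow the "<lang>_<corpus>-corefud" pattern, e.g. "en_gum-corefud".
ds = load_dataset(REPO_ID, "en_gum-corefud", split="validation")

doc = ds[0]
print(doc["id"])                         # document identifier
print(len(doc["sentences"]))             # sentences, each carrying CoNLL-U style token dicts
print(doc["sentences"][0]["tokens"][0])  # keys: text, lemma, upos, xpos, feats, head, deprel, misc, id
print(len(doc["coref_chains"]))          # coreference chains stored as nested lists of integers
```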
| [] | [
"TAGS\n#region-us \n"
] |
19799fd96dca0de037f5862a021700ff50d18eb2 |
# Dataset Card for Evaluation run of openagi-project/OpenAGI-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [openagi-project/OpenAGI-7B-v0.1](https://huggingface.co/openagi-project/OpenAGI-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.1",
"harness_winogrande_5",
split="train")
```
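The same call works for any of the timestamped runs. As a minimal sketch, you can load a specific run's split instead of the "latest" alias (the config and split names below come from this dataset's own config listing):
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.1",
    "harness_gsm8k_5",
    split="2024_01_26T17_39_46.193538",  # a specific run; use "latest" for the most recent one
)
```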
## Latest results
These are the [latest results from run 2024-01-26T17:39:46.193538](https://huggingface.co/datasets/open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.1/blob/main/results_2024-01-26T17-39-46.193538.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6376909717416875,
"acc_stderr": 0.032524006961840754,
"acc_norm": 0.6396432481543075,
"acc_norm_stderr": 0.033178412950361426,
"mc1": 0.5189718482252142,
"mc1_stderr": 0.017490896405762353,
"mc2": 0.6955285302922068,
"mc2_stderr": 0.015057872155924216
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175452,
"acc_norm": 0.6672354948805461,
"acc_norm_stderr": 0.013769863046192305
},
"harness|hellaswag|10": {
"acc": 0.6885082652857997,
"acc_stderr": 0.0046215681251020446,
"acc_norm": 0.8612826130252937,
"acc_norm_stderr": 0.003449449618650549
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406776,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406776
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03053289223393202,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03053289223393202
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187215,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187215
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.0165952597103993,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.0165952597103993
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278884,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729147,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532063,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532063
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5189718482252142,
"mc1_stderr": 0.017490896405762353,
"mc2": 0.6955285302922068,
"mc2_stderr": 0.015057872155924216
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.01135031570746206
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064693
}
}
```
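As an illustration only (not part of the official tooling), a minimal sketch of ranking tasks by `acc_norm`, using a few entries copied from the results above:
```python
# Values copied from the results shown above; extend with any other tasks of interest.
results = {
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.9145299145299145},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8538011695906432},
    "harness|hendrycksTest-college_mathematics|5": {"acc_norm": 0.33},
}

ranked = sorted(results.items(), key=lambda kv: kv[1]["acc_norm"], reverse=True)
for task, scores in ranked:
    print(f"{task}: acc_norm={scores['acc_norm']:.3f}")
```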
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.1 | [
"region:us"
] | 2024-01-21T20:54:24+00:00 | {"pretty_name": "Evaluation run of openagi-project/OpenAGI-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [openagi-project/OpenAGI-7B-v0.1](https://huggingface.co/openagi-project/OpenAGI-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-26T17:39:46.193538](https://huggingface.co/datasets/open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.1/blob/main/results_2024-01-26T17-39-46.193538.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6376909717416875,\n \"acc_stderr\": 0.032524006961840754,\n \"acc_norm\": 0.6396432481543075,\n \"acc_norm_stderr\": 0.033178412950361426,\n \"mc1\": 0.5189718482252142,\n \"mc1_stderr\": 0.017490896405762353,\n \"mc2\": 0.6955285302922068,\n \"mc2_stderr\": 0.015057872155924216\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n \"acc_norm\": 0.6672354948805461,\n \"acc_norm_stderr\": 0.013769863046192305\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6885082652857997,\n \"acc_stderr\": 0.0046215681251020446,\n \"acc_norm\": 0.8612826130252937,\n \"acc_norm_stderr\": 0.003449449618650549\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406776,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406776\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n 
\"acc_norm_stderr\": 0.024233532297758723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187215,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187215\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.0165952597103993,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.0165952597103993\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n \"acc_stderr\": 0.016531170993278884,\n \"acc_norm\": 0.4245810055865922,\n \"acc_norm_stderr\": 0.016531170993278884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729147,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729147\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532063,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532063\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5189718482252142,\n \"mc1_stderr\": 0.017490896405762353,\n \"mc2\": 0.6955285302922068,\n \"mc2_stderr\": 0.015057872155924216\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.01135031570746206\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n 
\"acc_stderr\": 0.013650728047064693\n }\n}\n```", "repo_url": "https://huggingface.co/openagi-project/OpenAGI-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|arc:challenge|25_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|arc:challenge|25_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|gsm8k|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|gsm8k|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hellaswag|10_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hellaswag|10_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-52-03.715367.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T20-52-03.715367.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-26T17-39-46.193538.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-26T17-39-46.193538.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T17-39-46.193538.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-26T17-39-46.193538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T17-39-46.193538.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": 
"2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-52-03.715367.parquet"]}, 
{"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["**/details_harness|winogrande|5_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": ["**/details_harness|winogrande|5_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-26T17-39-46.193538.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T20_52_03.715367", "path": ["results_2024-01-21T20-52-03.715367.parquet"]}, {"split": "2024_01_26T17_39_46.193538", "path": 
["results_2024-01-26T17-39-46.193538.parquet"]}, {"split": "latest", "path": ["results_2024-01-26T17-39-46.193538.parquet"]}]}]} | 2024-01-26T17:42:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of openagi-project/OpenAGI-7B-v0.1
Dataset automatically created during the evaluation run of model openagi-project/OpenAGI-7B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
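A minimal sketch, assuming the `details_<org>__<model>` repository naming used for the other models in this document; `harness_winogrande_5` is just one example task configuration:

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> naming pattern seen elsewhere
# in this document; "harness_winogrande_5" is one of the per-task configurations.
data = load_dataset("open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.1",
    "harness_winogrande_5",
    split="train")
```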
## Latest results
These are the latest results from run 2024-01-26T17:39:46.193538 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
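The aggregated metrics can be pulled the same way; a short sketch under the same assumed repository id, using the "results" configuration and its "latest" split listed in the metadata above:

```python
from datasets import load_dataset

# "results" aggregates the per-task metrics; the "latest" split always points at
# the most recent evaluation run (2024-01-26T17:39:46.193538 for this model).
results = load_dataset("open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.1",
    "results",
    split="latest")
print(results[0])
```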
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of openagi-project/OpenAGI-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model openagi-project/OpenAGI-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-26T17:39:46.193538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openagi-project/OpenAGI-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model openagi-project/OpenAGI-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-26T17:39:46.193538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0fa2334a8149d051903876cca016c63f2ddfc1db | # Dataset Card for "Fonts-Individual-Letters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Leeps/Fonts-Individual-Letters | [
"region:us"
] | 2024-01-21T20:58:41+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 78370892.928, "num_examples": 62976}], "download_size": 65794011, "dataset_size": 78370892.928}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-21T20:58:54+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Fonts-Individual-Letters"
More Information needed | [
"# Dataset Card for \"Fonts-Individual-Letters\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Fonts-Individual-Letters\"\n\nMore Information needed"
] |
f41ce7f0b1556e1753fba6538c48db8171343880 |
# Dataset of herja (Granblue Fantasy)
This is the dataset of herja (Granblue Fantasy), containing 28 images and their tags.
The core tags of this character are `green_hair, long_hair, breasts, ponytail, hair_ribbon, ribbon, brown_eyes, large_breasts, bangs, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 37.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herja_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 21.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herja_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 63 | 43.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herja_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 32.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herja_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 63 | 60.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herja_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/herja_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, thighhighs, bare_shoulders, collarbone, long_sleeves, cleavage, off_shoulder, simple_background, sword, thigh_boots, belt, sweater_dress, white_background, ribbed_sweater |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | thighhighs | bare_shoulders | collarbone | long_sleeves | cleavage | off_shoulder | simple_background | sword | thigh_boots | belt | sweater_dress | white_background | ribbed_sweater |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-------------|:-----------------|:-------------|:---------------|:-----------|:---------------|:--------------------|:--------|:--------------|:-------|:----------------|:-------------------|:-----------------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/herja_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:01:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T21:07:12+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of herja (Granblue Fantasy)
===================================
This is the dataset of herja (Granblue Fantasy), containing 28 images and their tags.
The core tags of this character are 'green\_hair, long\_hair, breasts, ponytail, hair\_ribbon, ribbon, brown\_eyes, large\_breasts, bangs, red\_ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
e77a207b0a131cf83188ff078438e88b112a4717 |
# Dataset of almeida (Granblue Fantasy)
This is the dataset of almeida (Granblue Fantasy), containing 44 images and their tags.
The core tags of this character are `horns, long_hair, pointy_ears, blonde_hair, breasts, dark_skin, dark-skinned_female, very_long_hair, earrings, large_breasts, blue_eyes, bangs, twintails, aqua_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 54.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/almeida_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 44 | 33.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/almeida_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 103 | 68.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/almeida_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 44 | 49.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/almeida_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 103 | 94.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/almeida_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/almeida_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 44 |  |  |  |  |  | draph, 1girl, jewelry, cleavage, solo, looking_at_viewer, navel_piercing, blush, smile, black_bikini, simple_background, open_mouth, white_background, gloves, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | draph | 1girl | jewelry | cleavage | solo | looking_at_viewer | navel_piercing | blush | smile | black_bikini | simple_background | open_mouth | white_background | gloves | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------|:-----------|:-------|:--------------------|:-----------------|:--------|:--------|:---------------|:--------------------|:-------------|:-------------------|:---------|:--------|
| 0 | 44 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/almeida_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:02:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T21:12:03+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of almeida (Granblue Fantasy)
=====================================
This is the dataset of almeida (Granblue Fantasy), containing 44 images and their tags.
The core tags of this character are 'horns, long\_hair, pointy\_ears, blonde\_hair, breasts, dark\_skin, dark-skinned\_female, very\_long\_hair, earrings, large\_breasts, blue\_eyes, bangs, twintails, aqua\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1f0213ef2f75aa419fed00cc95c7000230588473 |
# Dataset of mimlemel (Granblue Fantasy)
This is the dataset of mimlemel (Granblue Fantasy), containing 21 images and their tags.
The core tags of this character are `pointy_ears, green_hair, ponytail, hair_ornament, blue_eyes, bow, hair_bow, bangs, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 12.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimlemel_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 10.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimlemel_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 15.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimlemel_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 12.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimlemel_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 17.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimlemel_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mimlemel_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, solo, bare_shoulders, bikini, blush, harvin, looking_at_viewer, white_background, barefoot, closed_mouth, navel, simple_background, skirt, ass, bare_arms, bare_legs, clothes_lift, collarbone, from_behind, hair_flower, lifted_by_self, looking_back, short_hair, twitter_username |
| 1 | 5 |  |  |  |  |  | harvin, trumpet, 1girl, hair_flower, barefoot, holding, navel, solo, water, bare_shoulders, bikini_skirt, blonde_hair, collarbone, horns, looking_at_viewer, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | bikini | blush | harvin | looking_at_viewer | white_background | barefoot | closed_mouth | navel | simple_background | skirt | ass | bare_arms | bare_legs | clothes_lift | collarbone | from_behind | hair_flower | lifted_by_self | looking_back | short_hair | twitter_username | trumpet | holding | water | bikini_skirt | blonde_hair | horns | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:---------|:--------|:---------|:--------------------|:-------------------|:-----------|:---------------|:--------|:--------------------|:--------|:------|:------------|:------------|:---------------|:-------------|:--------------|:--------------|:-----------------|:---------------|:-------------|:-------------------|:----------|:----------|:--------|:---------------|:--------------|:--------|:----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | | X | X | | X | | X | | | | | | | X | | X | | | | | X | X | X | X | X | X | X |
| CyberHarem/mimlemel_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:02:03+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T21:05:13+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of mimlemel (Granblue Fantasy)
======================================
This is the dataset of mimlemel (Granblue Fantasy), containing 21 images and their tags.
The core tags of this character are 'pointy\_ears, green\_hair, ponytail, hair\_ornament, blue\_eyes, bow, hair\_bow, bangs, purple\_eyes'; these core tags are pruned from the per-image tags in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (a Hugging Face organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1d711b348b4aaba0b4b2bf8dbd7da6f2414a175b |
# Dataset Card for Evaluation run of RatanRohith/NeuralPizza-7B-V0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RatanRohith/NeuralPizza-7B-V0.2](https://huggingface.co/RatanRohith/NeuralPizza-7B-V0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2",
"harness_winogrande_5",
split="train")
```
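The other configurations declared in this card's metadata (one per evaluated task) can be loaded the same way. As a minimal sketch, here are the GSM8K details with the `latest` split (the config name `harness_gsm8k_5` and the split name come from the metadata at the bottom of this card):

```python
from datasets import load_dataset

# per-sample GSM8K details for this model; the metadata also declares a
# timestamped split ("2024_01_21T21_04_32.551333") pinning this specific run
gsm8k = load_dataset(
    "open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2",
    "harness_gsm8k_5",
    split="latest",
)
print(len(gsm8k), gsm8k.column_names)
```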
## Latest results
These are the [latest results from run 2024-01-21T21:04:32.551333](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2/blob/main/results_2024-01-21T21-04-32.551333.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6473430412019332,
"acc_stderr": 0.03217057797429981,
"acc_norm": 0.64765430784867,
"acc_norm_stderr": 0.032828743060941856,
"mc1": 0.4565483476132191,
"mc1_stderr": 0.017437280953183695,
"mc2": 0.6137860051903924,
"mc2_stderr": 0.015431238638634374
},
"harness|arc:challenge|25": {
"acc": 0.6535836177474402,
"acc_stderr": 0.013905011180063239,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.6819358693487353,
"acc_stderr": 0.004647727222445386,
"acc_norm": 0.8610834495120494,
"acc_norm_stderr": 0.003451525868724678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.01629533232815581,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.01629533232815581
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701766,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701766
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335307,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724556,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724556
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169143,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169143
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4565483476132191,
"mc1_stderr": 0.017437280953183695,
"mc2": 0.6137860051903924,
"mc2_stderr": 0.015431238638634374
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569563
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.012782681251053201
}
}
```
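The same aggregated figures can be read back programmatically. A small sketch follows (the file name is taken from the results link above; depending on the harness version, the aggregates may sit at the top level of the JSON file or under a "results" key, so both cases are handled):

```python
import json

from huggingface_hub import hf_hub_download

# download the raw results file referenced above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2",
    repo_type="dataset",
    filename="results_2024-01-21T21-04-32.551333.json",
)
with open(path) as f:
    data = json.load(f)

# the aggregates may be nested under "results"; fall back to the top level otherwise
results = data.get("results", data)
for metric, value in results["all"].items():
    print(f"{metric}: {value:.4f}")
```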
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2 | [
"region:us"
] | 2024-01-21T21:06:49+00:00 | {"pretty_name": "Evaluation run of RatanRohith/NeuralPizza-7B-V0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [RatanRohith/NeuralPizza-7B-V0.2](https://huggingface.co/RatanRohith/NeuralPizza-7B-V0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T21:04:32.551333](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2/blob/main/results_2024-01-21T21-04-32.551333.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6473430412019332,\n \"acc_stderr\": 0.03217057797429981,\n \"acc_norm\": 0.64765430784867,\n \"acc_norm_stderr\": 0.032828743060941856,\n \"mc1\": 0.4565483476132191,\n \"mc1_stderr\": 0.017437280953183695,\n \"mc2\": 0.6137860051903924,\n \"mc2_stderr\": 0.015431238638634374\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6535836177474402,\n \"acc_stderr\": 0.013905011180063239,\n \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688065\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6819358693487353,\n \"acc_stderr\": 0.004647727222445386,\n \"acc_norm\": 0.8610834495120494,\n \"acc_norm_stderr\": 0.003451525868724678\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n 
\"acc_norm_stderr\": 0.02338193534812143\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n \"acc_stderr\": 0.01629533232815581,\n \"acc_norm\": 0.3877094972067039,\n \"acc_norm_stderr\": 0.01629533232815581\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n \"acc_stderr\": 0.012718456618701766,\n \"acc_norm\": 0.455019556714472,\n \"acc_norm_stderr\": 0.012718456618701766\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335307,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335307\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724556,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724556\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169143,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169143\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4565483476132191,\n \"mc1_stderr\": 0.017437280953183695,\n \"mc2\": 0.6137860051903924,\n \"mc2_stderr\": 0.015431238638634374\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569563\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.686125852918878,\n \"acc_stderr\": 0.012782681251053201\n }\n}\n```", "repo_url": "https://huggingface.co/RatanRohith/NeuralPizza-7B-V0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|arc:challenge|25_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|gsm8k|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hellaswag|10_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-04-32.551333.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-04-32.551333.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-04-32.551333.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T21-04-32.551333.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-04-32.551333.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["**/details_harness|winogrande|5_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-21T21-04-32.551333.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T21_04_32.551333", "path": ["results_2024-01-21T21-04-32.551333.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T21-04-32.551333.parquet"]}]}]} | 2024-01-21T21:07:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of RatanRohith/NeuralPizza-7B-V0.2
Dataset automatically created during the evaluation run of model RatanRohith/NeuralPizza-7B-V0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
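A minimal sketch, assuming the details repo follows the leaderboard's usual `details_<org>__<model>` naming (the `harness_winogrande_5` config name is taken from this dataset's configuration list):

```python
from datasets import load_dataset

# Repo id inferred from the Open LLM Leaderboard naming convention (assumption);
# "harness_winogrande_5" is one of the 63 per-task configurations of this dataset.
data = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2",
                    "harness_winogrande_5",
                    split="train")
```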
## Latest results
These are the latest results from run 2024-01-21T21:04:32.551333 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
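The aggregated metrics for this run are stored in the `results` configuration; a minimal sketch for loading them (same assumed repo id as above):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; the "latest" split always
# points to the newest run. Repo id inferred from the naming convention (assumption).
results = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.2",
                       "results",
                       split="latest")
print(results[0])
```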
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of RatanRohith/NeuralPizza-7B-V0.2\n\n\n\nDataset automatically created during the evaluation run of model RatanRohith/NeuralPizza-7B-V0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T21:04:32.551333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of RatanRohith/NeuralPizza-7B-V0.2\n\n\n\nDataset automatically created during the evaluation run of model RatanRohith/NeuralPizza-7B-V0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T21:04:32.551333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
545b3632b4fa1105f24d3c13baa302abdaaff3b1 |
# Dataset of tiamat (Granblue Fantasy)
This is the dataset of tiamat (Granblue Fantasy), containing 31 images and their tags.
The core tags of this character are `long_hair, blue_hair, pointy_ears, breasts, very_long_hair, red_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 23.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 19.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 57 | 31.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 22.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 57 | 37.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamat_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tiamat_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
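If you only need one of the pre-packaged `IMG+TXT` variants from the table above rather than the waifuc raw package, a minimal sketch reusing the same `hf_hub_download` call (the file name comes from the package table; the assumption that each image is paired with a same-named `.txt` tag file follows from the `IMG+TXT` type):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/tiamat_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract; each image is assumed to ship with a .txt file holding its tags
out_dir = 'dataset_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)
```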
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------|
| 0 | 31 |  |  |  |  |  | 1girl, solo, bare_shoulders, navel, cleavage, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | navel | cleavage | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------|:-----------|:--------------------|
| 0 | 31 |  |  |  |  |  | X | X | X | X | X | X |
| CyberHarem/tiamat_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:25:10+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T21:34:49+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of tiamat (Granblue Fantasy)
====================================
This is the dataset of tiamat (Granblue Fantasy), containing 31 images and their tags.
The core tags of this character are 'long\_hair, blue\_hair, pointy\_ears, breasts, very\_long\_hair, red\_eyes, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
bc1593f1230487e2ff250a7558c1ec2a00fb1d4e |
# Dataset of dorothy (Granblue Fantasy)
This is the dataset of dorothy (Granblue Fantasy), containing 14 images and their tags.
The core tags of this character are `maid_headdress, long_hair, bangs, brown_hair, twintails, blue_eyes, blunt_bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 16.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dorothy_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 9.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dorothy_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 15.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dorothy_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 14.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dorothy_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 22.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dorothy_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dorothy_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, looking_at_viewer, dress, holding, maid_apron, frills, full_body, open_mouth, juliet_sleeves, white_background, :d, blush, food, shoes, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | dress | holding | maid_apron | frills | full_body | open_mouth | juliet_sleeves | white_background | :d | blush | food | shoes | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:----------|:-------------|:---------|:------------|:-------------|:-----------------|:-------------------|:-----|:--------|:-------|:--------|:--------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/dorothy_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:25:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T21:28:47+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of dorothy (Granblue Fantasy)
=====================================
This is the dataset of dorothy (Granblue Fantasy), containing 14 images and their tags.
The core tags of this character are 'maid\_headdress, long\_hair, bangs, brown\_hair, twintails, blue\_eyes, blunt\_bangs, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1681a2125c6c8bc1bf34ee9ea4907c06a61cb84d |
# Dataset of alliah (Granblue Fantasy)
This is the dataset of alliah (Granblue Fantasy), containing 11 images and their tags.
The core tags of this character are `animal_ears, bangs, short_hair, grey_hair, yellow_eyes, breasts, brown_eyes, hair_ornament, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 17.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alliah_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 7.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alliah_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 17.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alliah_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alliah_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 28.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alliah_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/alliah_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, erune, solo, blush, cleavage, closed_mouth, smile, bare_shoulders, black_choker, collarbone |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | erune | solo | blush | cleavage | closed_mouth | smile | bare_shoulders | black_choker | collarbone |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:--------|:-----------|:---------------|:--------|:-----------------|:---------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/alliah_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:25:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T21:28:25+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of alliah (Granblue Fantasy)
====================================
This is the dataset of alliah (Granblue Fantasy), containing 11 images and their tags.
The core tags of this character are 'animal\_ears, bangs, short\_hair, grey\_hair, yellow\_eyes, breasts, brown\_eyes, hair\_ornament, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b62fad40a2598189226e73d6c98ca9a5f932747b |
# Dataset of helel_ben_shalem (Granblue Fantasy)
This is the dataset of helel_ben_shalem (Granblue Fantasy), containing 128 images and their tags.
The core tags of this character are `blonde_hair, long_hair, braid, dark-skinned_female, dark_skin, very_long_hair, breasts, bangs, red_eyes, ribbon, hair_ribbon, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 128 | 219.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/helel_ben_shalem_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 128 | 118.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/helel_ben_shalem_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 307 | 246.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/helel_ben_shalem_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 128 | 188.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/helel_ben_shalem_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 307 | 352.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/helel_ben_shalem_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/helel_ben_shalem_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, looking_at_viewer, pacifier, solo, tattoo |
| 1 | 9 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, pacifier, solo, black_nails, fingernails, looking_at_viewer, tattoo, white_background |
| 2 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, pacifier, solo, black_bikini, goggles_on_head, see-through, bare_shoulders, animal_ears, eyewear_on_head, water, blush, navel, wet, armpits, day, sky |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | looking_at_viewer | pacifier | solo | tattoo | black_nails | fingernails | white_background | black_bikini | goggles_on_head | see-through | animal_ears | eyewear_on_head | water | blush | navel | wet | armpits | day | sky |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:--------------------|:-----------|:-------|:---------|:--------------|:--------------|:-------------------|:---------------|:------------------|:--------------|:--------------|:------------------|:--------|:--------|:--------|:------|:----------|:------|:------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 2 | 19 |  |  |  |  |  | X | X | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/helel_ben_shalem_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:33:35+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T22:02:09+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of helel\_ben\_shalem (Granblue Fantasy)
================================================
This is the dataset of helel\_ben\_shalem (Granblue Fantasy), containing 128 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, braid, dark-skinned\_female, dark\_skin, very\_long\_hair, breasts, bangs, red\_eyes, ribbon, hair\_ribbon, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
950fedffeab9274bc8a41bc6935c617f43a4f039 |
# Dataset of jeanne_d_arc (Granblue Fantasy)
This is the dataset of jeanne_d_arc (Granblue Fantasy), containing 314 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hair_ornament, breasts, blue_eyes, hair_flower, hairband, large_breasts, bangs, hair_intakes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 314 | 432.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 314 | 261.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 759 | 546.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 314 | 391.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 759 | 739.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jeanne_d_arc_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jeanne_d_arc_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bare_shoulders, flower, looking_at_viewer, solo, white_dress, ahoge, detached_sleeves, blush, thighs, hair_between_eyes, white_background |
| 1 | 10 |  |  |  |  |  | 1girl, solo, white_dress, bare_shoulders, flower, gauntlets, looking_at_viewer, thighhighs, ahoge, flag, greaves, thigh_boots, armored_boots, blush, sword |
| 2 | 14 |  |  |  |  |  | 1girl, solo, thighhighs, medium_breasts, cleavage, gauntlets, holding_sword, bare_shoulders, flag, looking_at_viewer, very_long_hair, lily_(flower), armored_dress, collarbone, thigh_boots |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, very_long_hair, white_dress, ahoge, blush, thigh_boots, thighhighs, detached_sleeves, flower, hair_between_eyes, sitting, smile |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, official_alternate_costume, purple_bikini, solo, collarbone, simple_background, smile, lily_(flower), upper_body, white_background, blush, parted_lips |
| 5 | 13 |  |  |  |  |  | 1girl, cleavage, flower, looking_at_viewer, official_alternate_costume, purple_bikini, solo, bare_shoulders, blush, collarbone, navel, side-tie_bikini_bottom, diadem, simple_background, front-tie_bikini_top, white_background, parted_lips, ponytail, smile, see-through |
| 6 | 5 |  |  |  |  |  | 1girl, armpits, arms_behind_head, arms_up, cleavage, flower, looking_at_viewer, navel, official_alternate_costume, purple_bikini, solo, blush, smile, bare_shoulders, collarbone, diadem, front-tie_bikini_top, mouth_hold, purple_eyes, side-tie_bikini_bottom |
| 7 | 24 |  |  |  |  |  | 1girl, cleavage, day, official_alternate_costume, purple_bikini, flower, looking_at_viewer, outdoors, solo, blush, navel, ocean, beach, collarbone, bare_shoulders, blue_sky, side-tie_bikini_bottom, cloud, smile, front-tie_bikini_top, hair_between_eyes, armpits, diadem |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | flower | looking_at_viewer | solo | white_dress | ahoge | detached_sleeves | blush | thighs | hair_between_eyes | white_background | gauntlets | thighhighs | flag | greaves | thigh_boots | armored_boots | sword | medium_breasts | cleavage | holding_sword | very_long_hair | lily_(flower) | armored_dress | collarbone | sitting | smile | official_alternate_costume | purple_bikini | simple_background | upper_body | parted_lips | navel | side-tie_bikini_bottom | diadem | front-tie_bikini_top | ponytail | see-through | armpits | arms_behind_head | arms_up | mouth_hold | purple_eyes | day | outdoors | ocean | beach | blue_sky | cloud |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------|:--------------------|:-------|:--------------|:--------|:-------------------|:--------|:---------|:--------------------|:-------------------|:------------|:-------------|:-------|:----------|:--------------|:----------------|:--------|:-----------------|:-----------|:----------------|:-----------------|:----------------|:----------------|:-------------|:----------|:--------|:-----------------------------|:----------------|:--------------------|:-------------|:--------------|:--------|:-------------------------|:---------|:-----------------------|:-----------|:--------------|:----------|:-------------------|:----------|:-------------|:--------------|:------|:-----------|:--------|:--------|:-----------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | | X | X | | | | | | | | X | X | X | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | | X | | | X | | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | X | | | | X | | | X | | | | | | | | | X | | | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 13 |  |  |  |  |  | X | X | X | X | X | | | | X | | | X | | | | | | | | | X | | | | | X | | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | X | | | | X | | | | | | | | | | | | X | | | | | X | | X | X | X | | | | X | X | X | X | | | X | X | X | X | X | | | | | | |
| 7 | 24 |  |  |  |  |  | X | X | X | X | X | | | | X | | X | | | | | | | | | | X | | | | | X | | X | X | X | | | | X | X | X | X | | | X | | | | | X | X | X | X | X | X |
| CyberHarem/jeanne_d_arc_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:33:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-22T05:02:48+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of jeanne\_d\_arc (Granblue Fantasy)
============================================
This is the dataset of jeanne\_d\_arc (Granblue Fantasy), containing 314 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, hair\_ornament, breasts, blue\_eyes, hair\_flower, hairband, large\_breasts, bangs, hair\_intakes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
e73e125af5df79d67fd1b7046983858a6c960811 |
# Dataset of kukuru (Granblue Fantasy)
This is the dataset of kukuru (Granblue Fantasy), containing 134 images and their tags.
The core tags of this character are `blue_eyes, blonde_hair, long_hair, breasts, ribbon, twintails, hat, hair_ribbon, beret, medium_breasts, bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 134 | 165.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kukuru_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 134 | 101.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kukuru_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 335 | 219.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kukuru_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 134 | 148.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kukuru_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 335 | 291.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kukuru_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kukuru_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, bare_shoulders, blush, open_mouth, belt, collarbone, skirt, cleavage, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, halloween, looking_at_viewer, solo, white_gloves, witch_hat, smile, bow, detached_sleeves, dress, jack-o'-lantern, open_mouth, pumpkin, broom, white_headwear, blush, boots, cape, cleavage, high_heels, skirt |
| 2 | 8 |  |  |  |  |  | 1girl, blush, hetero, solo_focus, 1boy, nipples, penis, open_mouth, sex, vaginal, smile, sweat, blue_ribbon, collarbone, cum_in_pussy, mosaic_censoring, nude, one_eye_closed, spread_legs, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | bare_shoulders | blush | open_mouth | belt | collarbone | skirt | cleavage | white_background | halloween | white_gloves | witch_hat | bow | detached_sleeves | dress | jack-o'-lantern | pumpkin | broom | white_headwear | boots | cape | high_heels | hetero | solo_focus | 1boy | nipples | penis | sex | vaginal | sweat | blue_ribbon | cum_in_pussy | mosaic_censoring | nude | one_eye_closed | spread_legs | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-----------------|:--------|:-------------|:-------|:-------------|:--------|:-----------|:-------------------|:------------|:---------------|:------------|:------|:-------------------|:--------|:------------------|:----------|:--------|:-----------------|:--------|:-------|:-------------|:---------|:-------------|:-------|:----------|:--------|:------|:----------|:--------|:--------------|:---------------|:-------------------|:-------|:-----------------|:--------------|:--------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | X | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | | | X | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kukuru_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:33:51+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T22:02:19+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kukuru (Granblue Fantasy)
====================================
This is the dataset of kukuru (Granblue Fantasy), containing 134 images and their tags.
The core tags of this character are 'blue\_eyes, blonde\_hair, long\_hair, breasts, ribbon, twintails, hat, hair\_ribbon, beret, medium\_breasts, bangs, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
cb608c76c93037712c7bb8e6379d2b27a99c4686 |
# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-V5-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-V5-10.7B](https://huggingface.co/The-Face-Of-Goonery/Huginn-V5-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-V5-10.7B",
"harness_winogrande_5",
split="train")
```
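For the aggregated metrics, the "results" configuration listed in this card's metadata can be loaded the same way; a minimal sketch, assuming the "latest" split it declares:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-V5-10.7B",
    "results",
    split="latest",
)
print(results[0])
```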
## Latest results
These are the [latest results from run 2024-01-21T21:34:43.018988](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-V5-10.7B/blob/main/results_2024-01-21T21-34-43.018988.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5406753435067956,
"acc_stderr": 0.034178118270674414,
"acc_norm": 0.5478844753260135,
"acc_norm_stderr": 0.03492250437234291,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.44516734672506053,
"mc2_stderr": 0.015020540718678
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.01444946427886881,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.5738896634136627,
"acc_stderr": 0.004934995402995944,
"acc_norm": 0.7879904401513643,
"acc_norm_stderr": 0.004078962503408526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562413,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562413
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.567741935483871,
"acc_stderr": 0.028181739720019416,
"acc_norm": 0.567741935483871,
"acc_norm_stderr": 0.028181739720019416
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036543,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036543
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165636,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165636
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03274287914026868,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03274287914026868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.025334667080954915,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.025334667080954915
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7431192660550459,
"acc_stderr": 0.018732492928342472,
"acc_norm": 0.7431192660550459,
"acc_norm_stderr": 0.018732492928342472
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.033096825811190354,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.033096825811190354
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.0341078533890472,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.0341078533890472
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.032867453125679603,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.032867453125679603
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404034,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404034
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7215836526181354,
"acc_stderr": 0.016028295188992462,
"acc_norm": 0.7215836526181354,
"acc_norm_stderr": 0.016028295188992462
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.026538189104705477,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.026538189104705477
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.016104833880142288,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.016104833880142288
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.02801365189199507,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.02801365189199507
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271143,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39113428943937417,
"acc_stderr": 0.01246386183998206,
"acc_norm": 0.39113428943937417,
"acc_norm_stderr": 0.01246386183998206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.020130388312904524,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.020130388312904524
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.0300210562384403,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.0300210562384403
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6019900497512438,
"acc_stderr": 0.034611994290400135,
"acc_norm": 0.6019900497512438,
"acc_norm_stderr": 0.034611994290400135
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.44516734672506053,
"mc2_stderr": 0.015020540718678
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.01237092252726201
},
"harness|gsm8k|5": {
"acc": 0.19181197877179681,
"acc_stderr": 0.010845169955294016
}
}
```
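A minimal sketch of how one might condense the per-task numbers above, assuming `results` holds the dict printed in the block (loaded however you prefer, e.g. via `json.loads` on the file linked above):

```python
from statistics import mean

def summarize(results: dict) -> None:
    # Mean normalized accuracy over the MMLU (hendrycksTest) subtasks
    mmlu = [v["acc_norm"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
    print(f"ARC (25-shot) acc_norm:       {results['harness|arc:challenge|25']['acc_norm']:.4f}")
    print(f"HellaSwag (10-shot) acc_norm: {results['harness|hellaswag|10']['acc_norm']:.4f}")
    print(f"MMLU mean acc_norm ({len(mmlu)} tasks): {mean(mmlu):.4f}")
    print(f"TruthfulQA mc2:               {results['harness|truthfulqa:mc|0']['mc2']:.4f}")
    print(f"Winogrande (5-shot) acc:      {results['harness|winogrande|5']['acc']:.4f}")
    print(f"GSM8K (5-shot) acc:           {results['harness|gsm8k|5']['acc']:.4f}")
```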
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-V5-10.7B | [
"region:us"
] | 2024-01-21T21:37:04+00:00 | {"pretty_name": "Evaluation run of The-Face-Of-Goonery/Huginn-V5-10.7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-V5-10.7B](https://huggingface.co/The-Face-Of-Goonery/Huginn-V5-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-V5-10.7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T21:34:43.018988](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-V5-10.7B/blob/main/results_2024-01-21T21-34-43.018988.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5406753435067956,\n \"acc_stderr\": 0.034178118270674414,\n \"acc_norm\": 0.5478844753260135,\n \"acc_norm_stderr\": 0.03492250437234291,\n \"mc1\": 0.28886168910648713,\n \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.44516734672506053,\n \"mc2_stderr\": 0.015020540718678\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.01444946427886881,\n \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5738896634136627,\n \"acc_stderr\": 0.004934995402995944,\n \"acc_norm\": 0.7879904401513643,\n \"acc_norm_stderr\": 0.004078962503408526\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562413,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562413\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.567741935483871,\n \"acc_stderr\": 0.028181739720019416,\n \"acc_norm\": 0.567741935483871,\n \"acc_norm_stderr\": 0.028181739720019416\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036543,\n \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036543\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165636,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165636\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03274287914026868,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03274287914026868\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n \"acc_norm\": 0.8134715025906736,\n 
\"acc_norm_stderr\": 0.028112091210117467\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954915,\n \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954915\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7431192660550459,\n \"acc_stderr\": 0.018732492928342472,\n \"acc_norm\": 0.7431192660550459,\n \"acc_norm_stderr\": 0.018732492928342472\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.033096825811190354,\n \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.033096825811190354\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.0341078533890472,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.0341078533890472\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7046413502109705,\n \"acc_stderr\": 0.02969633871342288,\n \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.02969633871342288\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.032867453125679603,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.032867453125679603\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n \"acc_stderr\": 0.02645350805404034,\n \"acc_norm\": 0.7948717948717948,\n \"acc_norm_stderr\": 0.02645350805404034\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7215836526181354,\n \"acc_stderr\": 0.016028295188992462,\n \"acc_norm\": 0.7215836526181354,\n \"acc_norm_stderr\": 0.016028295188992462\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.026538189104705477,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.026538189104705477\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.016104833880142288,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.016104833880142288\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n \"acc_stderr\": 0.02801365189199507,\n \"acc_norm\": 0.5819935691318328,\n \"acc_norm_stderr\": 0.02801365189199507\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271143,\n \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271143\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39113428943937417,\n \"acc_stderr\": 0.01246386183998206,\n \"acc_norm\": 0.39113428943937417,\n \"acc_norm_stderr\": 0.01246386183998206\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.03018753206032938,\n \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.03018753206032938\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904524,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904524\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.0300210562384403,\n \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.0300210562384403\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n \"acc_stderr\": 0.034611994290400135,\n \"acc_norm\": 0.6019900497512438,\n \"acc_norm_stderr\": 0.034611994290400135\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.44516734672506053,\n \"mc2_stderr\": 0.015020540718678\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.01237092252726201\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.19181197877179681,\n \"acc_stderr\": 0.010845169955294016\n }\n}\n```", "repo_url": "https://huggingface.co/The-Face-Of-Goonery/Huginn-V5-10.7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|arc:challenge|25_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|gsm8k|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hellaswag|10_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-34-43.018988.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-34-43.018988.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-34-43.018988.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T21-34-43.018988.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-34-43.018988.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["**/details_harness|winogrande|5_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-21T21-34-43.018988.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T21_34_43.018988", "path": ["results_2024-01-21T21-34-43.018988.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T21-34-43.018988.parquet"]}]}]} | 2024-01-21T21:37:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-V5-10.7B
Dataset automatically created during the evaluation run of model The-Face-Of-Goonery/Huginn-V5-10.7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
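A minimal sketch, assuming the repository id follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other evaluation datasets in this collection:

```python
from datasets import load_dataset

# Assumption: the repository id mirrors the evaluated model name, as in the other cards here.
data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-V5-10.7B",
	"harness_winogrande_5",
	split="train")
```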
## Latest results
These are the latest results from run 2024-01-21T21:34:43.018988 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-V5-10.7B\n\n\n\nDataset automatically created during the evaluation run of model The-Face-Of-Goonery/Huginn-V5-10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T21:34:43.018988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-V5-10.7B\n\n\n\nDataset automatically created during the evaluation run of model The-Face-Of-Goonery/Huginn-V5-10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T21:34:43.018988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
422a31c4c61059e4cac839c71f3451ab8829b869 |
# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/MBX-7B](https://huggingface.co/flemmingmiguel/MBX-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
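Since navigating 63 configurations by hand is tedious, the standard `datasets` helpers can list them and their splits; a small sketch (the repository id is the one used in the loading example below):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_flemmingmiguel__MBX-7B"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs))

# Each configuration exposes one split per run timestamp plus a "latest" alias.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```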
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MBX-7B",
"harness_winogrande_5",
split="train")
```
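The same pattern applies to any of the per-task configurations; for instance, a sketch pulling the most recent GSM8K details via the `latest` split declared in this card's metadata:

```python
from datasets import load_dataset

# "harness_gsm8k_5" is one of the per-task configurations of this dataset;
# its "latest" split aliases the most recent run timestamp.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_flemmingmiguel__MBX-7B",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```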
## Latest results
These are the [latest results from run 2024-01-21T21:40:54.851696](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MBX-7B/blob/main/results_2024-01-21T21-40-54.851696.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6554465222637362,
"acc_stderr": 0.03202723759090997,
"acc_norm": 0.6546590340930132,
"acc_norm_stderr": 0.03269786833995499,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6911254426672954,
"mc2_stderr": 0.015116841160158632
},
"harness|arc:challenge|25": {
"acc": 0.7056313993174061,
"acc_stderr": 0.013318528460539422,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545797
},
"harness|hellaswag|10": {
"acc": 0.7148974307906791,
"acc_stderr": 0.004505406176606851,
"acc_norm": 0.883788090021908,
"acc_norm_stderr": 0.0031982389518176264
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079069,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6911254426672954,
"mc2_stderr": 0.015116841160158632
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273768
},
"harness|gsm8k|5": {
"acc": 0.7126611068991661,
"acc_stderr": 0.01246467706010708
}
}
```
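To work with these aggregated numbers programmatically rather than copying them from the JSON above, the "results" configuration described earlier can be loaded directly; a minimal sketch that does not assume a particular column layout:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_flemmingmiguel__MBX-7B",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the available fields first
print(results[0])            # first row of the aggregated results
```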
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_flemmingmiguel__MBX-7B | [
"region:us"
] | 2024-01-21T21:43:13+00:00 | {"pretty_name": "Evaluation run of flemmingmiguel/MBX-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/MBX-7B](https://huggingface.co/flemmingmiguel/MBX-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__MBX-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T21:40:54.851696](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MBX-7B/blob/main/results_2024-01-21T21-40-54.851696.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6554465222637362,\n \"acc_stderr\": 0.03202723759090997,\n \"acc_norm\": 0.6546590340930132,\n \"acc_norm_stderr\": 0.03269786833995499,\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6911254426672954,\n \"mc2_stderr\": 0.015116841160158632\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.013318528460539422,\n \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545797\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7148974307906791,\n \"acc_stderr\": 0.004505406176606851,\n \"acc_norm\": 0.883788090021908,\n \"acc_norm_stderr\": 0.0031982389518176264\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079069,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6911254426672954,\n \"mc2_stderr\": 0.015116841160158632\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273768\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7126611068991661,\n \"acc_stderr\": 0.01246467706010708\n 
}\n}\n```", "repo_url": "https://huggingface.co/flemmingmiguel/MBX-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|arc:challenge|25_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|gsm8k|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hellaswag|10_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-40-54.851696.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-40-54.851696.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-40-54.851696.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T21-40-54.851696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-40-54.851696.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T21_40_54.851696", "path": ["**/details_harness|winogrande|5_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T21-40-54.851696.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T21_40_54.851696", "path": ["results_2024-01-21T21-40-54.851696.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T21-40-54.851696.parquet"]}]}]} | 2024-01-21T21:43:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B
Dataset automatically created during the evaluation run of model flemmingmiguel/MBX-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
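A minimal sketch of that call is shown below; the repository id is an assumption, inferred from the `open-llm-leaderboard/details_<org>__<model>` naming used by the other evaluation cards in this collection:

```python
from datasets import load_dataset

# repository id assumed from the details_<org>__<model> naming convention
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MBX-7B",
	"harness_winogrande_5",
	split="train")
```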
## Latest results
These are the latest results from run 2024-01-21T21:40:54.851696 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/MBX-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T21:40:54.851696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/MBX-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-21T21:40:54.851696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4197c28ae55864525f333e8e8e0a893a9bb466cf |
# Dataset of kumuyu (Granblue Fantasy)
This is the dataset of kumuyu (Granblue Fantasy), containing 16 images and their tags.
The core tags of this character are `horns, breasts, large_breasts, twintails, long_hair, brown_eyes, yellow_eyes, low_twintails, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 21.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 12.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 25.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 19.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 37.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumuyu_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kumuyu_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
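As a quick usage example, the loaded items can be filtered on their tag metadata; the snippet below is a sketch that assumes the tag collection supports a plain membership test (the tag name `smile` is taken from the cluster table further down):

```python
# re-open the extracted dataset and keep only images whose tags mention 'smile'
smiling = [item for item in LocalSource(dataset_dir) if 'smile' in item.meta['tags']]
print(len(smiling), "images tagged with 'smile'")
```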
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, draph, solo, looking_at_viewer, collarbone, hood, long_sleeves, open_mouth, sweat, wide_sleeves, breasts_out, inverted_nipples, navel, smile, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | draph | solo | looking_at_viewer | collarbone | hood | long_sleeves | open_mouth | sweat | wide_sleeves | breasts_out | inverted_nipples | navel | smile | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-------|:--------------------|:-------------|:-------|:---------------|:-------------|:--------|:---------------|:--------------|:-------------------|:--------|:--------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kumuyu_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:47:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T21:50:36+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kumuyu (Granblue Fantasy)
====================================
This is the dataset of kumuyu (Granblue Fantasy), containing 16 images and their tags.
The core tags of this character are 'horns, breasts, large\_breasts, twintails, long\_hair, brown\_eyes, yellow\_eyes, low\_twintails, grey\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
3a60cd21715571dbac990e453980c0f37a46d2d7 |
# Dataset of nemone (Granblue Fantasy)
This is the dataset of nemone (Granblue Fantasy), containing 111 images and their tags.
The core tags of this character are `animal_ears, dark_skin, dark-skinned_female, hair_ornament, long_hair, breasts, hair_flower, green_eyes, grey_hair, cat_ears, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 111 | 122.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nemone_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 111 | 81.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nemone_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 250 | 164.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nemone_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 111 | 114.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nemone_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 250 | 219.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nemone_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nemone_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 46 |  |  |  |  |  | 1girl, erune, solo, flower, looking_at_viewer, smile, black_gloves, open_mouth, blush, sideboob, simple_background, white_background, bare_shoulders |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | solo | flower | looking_at_viewer | smile | black_gloves | open_mouth | blush | sideboob | simple_background | white_background | bare_shoulders |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:---------|:--------------------|:--------|:---------------|:-------------|:--------|:-----------|:--------------------|:-------------------|:-----------------|
| 0 | 46 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nemone_granbluefantasy | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-21T21:47:08+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-21T22:04:17+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of nemone (Granblue Fantasy)
====================================
This is the dataset of nemone (Granblue Fantasy), containing 111 images and their tags.
The core tags of this character are 'animal\_ears, dark\_skin, dark-skinned\_female, hair\_ornament, long\_hair, breasts, hair\_flower, green\_eyes, grey\_hair, cat\_ears, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ff43d49c5b077c8fa5e8b9aa2a781bcae8b0efd4 |
# Dataset Card for Evaluation run of cognitivecomputations/MegaDolphin-120b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/MegaDolphin-120b](https://huggingface.co/cognitivecomputations/MegaDolphin-120b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b",
"harness_winogrande_5",
split="train")
```
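The aggregated metrics live in the additional `results` configuration mentioned above; a variant of the same call (configuration and split names are assumptions, taken from the file listings used across this card collection) looks like:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points to the most recent run
results = load_dataset("open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b",
	"results",
	split="latest")
```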
## Latest results
These are the [latest results from run 2024-01-21T21:48:58.549252](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b/blob/main/results_2024-01-21T21-48-58.549252.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6902282983987049,
"acc_stderr": 0.03061680900457877,
"acc_norm": 0.6956826713561578,
"acc_norm_stderr": 0.03120808276540501,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.017240861812099804,
"mc2": 0.592821117756168,
"mc2_stderr": 0.015249093012153285
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620196,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238361
},
"harness|hellaswag|10": {
"acc": 0.7026488747261501,
"acc_stderr": 0.004561582009834577,
"acc_norm": 0.8780123481378211,
"acc_norm_stderr": 0.0032660269509226444
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.0271342916287417,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.0271342916287417
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802268,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802268
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388525,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632158,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632158
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334335,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7461538461538462,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.7461538461538462,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868834,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.02758406660220827,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.02758406660220827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.028311601441438607,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.028311601441438607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822582,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822582
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8441890166028098,
"acc_stderr": 0.012969269247762578,
"acc_norm": 0.8441890166028098,
"acc_norm_stderr": 0.012969269247762578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5150837988826815,
"acc_stderr": 0.01671489037999606,
"acc_norm": 0.5150837988826815,
"acc_norm_stderr": 0.01671489037999606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.02161380939522479,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.02161380939522479
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5867014341590613,
"acc_stderr": 0.012576779494860076,
"acc_norm": 0.5867014341590613,
"acc_norm_stderr": 0.012576779494860076
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740533,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740533
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146606,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146606
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.017240861812099804,
"mc2": 0.592821117756168,
"mc2_stderr": 0.015249093012153285
},
"harness|winogrande|5": {
"acc": 0.8184688239936859,
"acc_stderr": 0.010833276515007508
},
"harness|gsm8k|5": {
"acc": 0.4624715693707354,
"acc_stderr": 0.013733636059107756
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b | [
"region:us"
] | 2024-01-21T21:51:17+00:00 | {"pretty_name": "Evaluation run of cognitivecomputations/MegaDolphin-120b", "dataset_summary": "Dataset automatically created during the evaluation run of model [cognitivecomputations/MegaDolphin-120b](https://huggingface.co/cognitivecomputations/MegaDolphin-120b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T21:48:58.549252](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b/blob/main/results_2024-01-21T21-48-58.549252.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6902282983987049,\n \"acc_stderr\": 0.03061680900457877,\n \"acc_norm\": 0.6956826713561578,\n \"acc_norm_stderr\": 0.03120808276540501,\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.017240861812099804,\n \"mc2\": 0.592821117756168,\n \"mc2_stderr\": 0.015249093012153285\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620196,\n \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238361\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7026488747261501,\n \"acc_stderr\": 0.004561582009834577,\n \"acc_norm\": 0.8780123481378211,\n \"acc_norm_stderr\": 0.0032660269509226444\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882923,\n \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882923\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.0271342916287417,\n \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.0271342916287417\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.03216600808802268,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.03216600808802268\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388525,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388525\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n \"acc_stderr\": 0.02141724293632158,\n \"acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.02141724293632158\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334335,\n \"acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334335\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7461538461538462,\n \"acc_stderr\": 0.022066054378726257,\n \"acc_norm\": 0.7461538461538462,\n \"acc_norm_stderr\": 0.022066054378726257\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295894,\n \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295894\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868834,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.02758406660220827,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.02758406660220827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.028311601441438607,\n \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.028311601441438607\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8441890166028098,\n \"acc_stderr\": 0.012969269247762578,\n \"acc_norm\": 0.8441890166028098,\n \"acc_norm_stderr\": 0.012969269247762578\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5150837988826815,\n \"acc_stderr\": 0.01671489037999606,\n \"acc_norm\": 0.5150837988826815,\n \"acc_norm_stderr\": 0.01671489037999606\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.02161380939522479,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.02161380939522479\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5867014341590613,\n \"acc_stderr\": 0.012576779494860076,\n \"acc_norm\": 0.5867014341590613,\n \"acc_norm_stderr\": 0.012576779494860076\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740533,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740533\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146606,\n \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146606\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.02688214492230774,\n \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.02688214492230774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.017240861812099804,\n \"mc2\": 0.592821117756168,\n \"mc2_stderr\": 0.015249093012153285\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.010833276515007508\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4624715693707354,\n \"acc_stderr\": 
0.013733636059107756\n }\n}\n```", "repo_url": "https://huggingface.co/cognitivecomputations/MegaDolphin-120b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|arc:challenge|25_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|gsm8k|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hellaswag|10_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-48-58.549252.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-48-58.549252.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-48-58.549252.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T21-48-58.549252.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-48-58.549252.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T21_48_58.549252", "path": ["**/details_harness|winogrande|5_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T21-48-58.549252.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T21_48_58.549252", "path": ["results_2024-01-21T21-48-58.549252.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T21-48-58.549252.parquet"]}]}]} | 2024-01-21T21:51:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cognitivecomputations/MegaDolphin-120b
Dataset automatically created during the evaluation run of model cognitivecomputations/MegaDolphin-120b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
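As a quick sanity check, the split names declared for any of the per-task configs can be listed with the `datasets` library. This is a minimal sketch; the config name below (`harness_gsm8k_5`) is one of those declared in this card's metadata.

```python
from datasets import get_dataset_split_names

# Prints the splits declared for this config in the card's metadata:
# the run timestamp ("2024_01_21T21_48_58.549252") and "latest".
print(get_dataset_split_names(
    "open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b",
    "harness_gsm8k_5",
))
```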
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
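For instance, a minimal sketch of reading those aggregated metrics back with the `datasets` library (the `results` config and the `latest` split used below are the names declared in this card's metadata):

```python
from datasets import load_dataset

# "results" is the config holding the aggregated scores; "latest" points to the most recent run.
agg = load_dataset(
    "open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b",
    "results",
    split="latest",
)
print(agg.column_names)  # which aggregated fields are stored
print(agg[0])            # first row: aggregated scores for the latest run
```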
To load the details from a run, you can for instance do the following:
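```python
from datasets import load_dataset

# Per-sample details for one evaluated task (Winogrande, 5-shot).
# The config name and the "latest" split are taken from this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b",
    "harness_winogrande_5",
    split="latest",
)
print(len(data), data.column_names)
```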
## Latest results
These are the latest results from run 2024-01-21T21:48:58.549252 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
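Only the aggregate and headline task scores from that run are excerpted below; the per-subject `hendrycksTest` (MMLU) entries are preserved in the full results JSON recorded in this dataset's metadata:

```python
{
    "all": {
        "acc": 0.6902282983987049,
        "acc_stderr": 0.03061680900457877,
        "acc_norm": 0.6956826713561578,
        "acc_norm_stderr": 0.03120808276540501,
        "mc1": 0.41370869033047736,
        "mc1_stderr": 0.017240861812099804,
        "mc2": 0.592821117756168,
        "mc2_stderr": 0.015249093012153285
    },
    "harness|arc:challenge|25": {
        "acc": 0.643344709897611,
        "acc_stderr": 0.013998056902620196,
        "acc_norm": 0.6902730375426621,
        "acc_norm_stderr": 0.013512058415238361
    },
    "harness|hellaswag|10": {
        "acc": 0.7026488747261501,
        "acc_stderr": 0.004561582009834577,
        "acc_norm": 0.8780123481378211,
        "acc_norm_stderr": 0.0032660269509226444
    },
    # per-subject "harness|hendrycksTest-*|5" entries omitted here
    "harness|truthfulqa:mc|0": {
        "mc1": 0.41370869033047736,
        "mc1_stderr": 0.017240861812099804,
        "mc2": 0.592821117756168,
        "mc2_stderr": 0.015249093012153285
    },
    "harness|winogrande|5": {
        "acc": 0.8184688239936859,
        "acc_stderr": 0.010833276515007508
    },
    "harness|gsm8k|5": {
        "acc": 0.4624715693707354,
        "acc_stderr": 0.013733636059107756
    }
}
```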
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact