Commit · fafffc2
1 Parent(s): fdd134b
Auto Daily Leaderboard update Wed Dec 11 12:00:32 PM EST 2024
- arena_elo/results/20241211/clean_battle_image_editing.json +0 -0
- arena_elo/results/20241211/clean_battle_t2i_generation.json +0 -0
- arena_elo/results/20241211/clean_battle_video_generation.json +0 -0
- arena_elo/results/20241211/elo_results_image_editing.pkl +3 -0
- arena_elo/results/20241211/elo_results_t2i_generation.pkl +3 -0
- arena_elo/results/20241211/elo_results_video_generation.pkl +3 -0
- arena_elo/results/20241211/image_editing_leaderboard.csv +11 -0
- arena_elo/results/20241211/t2i_generation_leaderboard.csv +18 -0
- arena_elo/results/20241211/video_generation_leaderboard.csv +14 -0
- arena_elo/results/latest/clean_battle_image_editing.json +192 -0
- arena_elo/results/latest/clean_battle_t2i_generation.json +392 -0
- arena_elo/results/latest/clean_battle_video_generation.json +196 -0
- arena_elo/results/latest/elo_results_image_editing.pkl +2 -2
- arena_elo/results/latest/elo_results_t2i_generation.pkl +2 -2
- arena_elo/results/latest/elo_results_video_generation.pkl +1 -1
- arena_elo/results/latest/image_editing_leaderboard.csv +10 -9
- arena_elo/results/latest/t2i_generation_leaderboard.csv +17 -17
- arena_elo/results/latest/video_generation_leaderboard.csv +13 -13
arena_elo/results/20241211/clean_battle_image_editing.json
ADDED
The diff for this file is too large to render.
arena_elo/results/20241211/clean_battle_t2i_generation.json
ADDED
The diff for this file is too large to render.
arena_elo/results/20241211/clean_battle_video_generation.json
ADDED
The diff for this file is too large to render.
arena_elo/results/20241211/elo_results_image_editing.pkl
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fa959671e9c719a0a52106fd45cb9e3bbbb5dc957b919b27f78e554d4e586354
+size 66042
arena_elo/results/20241211/elo_results_t2i_generation.pkl
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6cda80aaece5c0f36b1e8972a6aa82c8c0212ce5682d09157826283573348ae5
+size 88249
arena_elo/results/20241211/elo_results_video_generation.pkl
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:cdba33bfac60c9a15179bcbd0c514784f74631dae0f8029b6396dd8dc10030ea
+size 75108
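
The three .pkl result files above are committed as Git LFS pointer files, so the repository stores only the pointer fields shown in the diff (spec version, sha256 object id, byte size), not the pickled Elo results themselves. A minimal Python sketch for reading those pointer fields, assuming you only need the metadata and not the underlying object:

    from pathlib import Path

    def read_lfs_pointer(path: str) -> dict:
        """Parse a Git LFS pointer file into its key/value fields."""
        fields = {}
        for line in Path(path).read_text().splitlines():
            key, _, value = line.partition(" ")
            fields[key] = value
        return fields

    # Example with one of the pointers added in this commit:
    # ptr = read_lfs_pointer("arena_elo/results/20241211/elo_results_image_editing.pkl")
    # ptr["oid"]       -> "sha256:fa959671e9c719a0a52106fd45cb9e3bbbb5dc957b919b27f78e554d4e586354"
    # int(ptr["size"]) -> 66042

Fetching the actual pickle requires pulling the LFS object (e.g. git lfs pull); the working tree otherwise contains just this pointer.
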
arena_elo/results/20241211/image_editing_leaderboard.csv
ADDED
@@ -0,0 +1,11 @@
+key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
+MagicBrush,MagicBrush,1099.0867788929968,1103.1120557156723,CC-BY-4.0,"The Ohio State University, University of Waterloo",https://osu-nlp-group.github.io/MagicBrush/
+InfEdit,InfEdit,1064.570347037348,1064.3480021089454,CC BY-NC-ND 4.0,"University of Michigan, University of California, Berkeley",https://sled-group.github.io/InfEdit/
+UltraEdit,UltraEdit,1060.1430521278826,1060.1858880341144,other,Peking University; BIGAI,https://ultra-editing.github.io/
+CosXLEdit,CosXLEdit,1059.3947130775634,1060.33362522597,cosxl-nc-community,Stability AI,https://huggingface.co/stabilityai/cosxl
+InstructPix2Pix,InstructPix2Pix,1033.0249954054239,1030.7488651531455,"Copyright 2023 Timothy Brooks, Aleksander Holynski, Alexei A. Efros","University of California, Berkeley",https://www.timothybrooks.com/instruct-pix2pix
+PNP,PNP,992.2660415414323,996.7618215250773,-,Weizmann Institute of Science,https://github.com/MichalGeyer/plug-and-play
+Prompt2prompt,Prompt2prompt,984.5203259478594,985.5845402588227,Apache-2.0,"Google, Tel Aviv University",https://prompt-to-prompt.github.io/
+CycleDiffusion,CycleDiffusion,935.5392021913754,929.2791002660932,X11,Carnegie Mellon University,https://github.com/ChenWu98/cycle-diffusion?tab=readme-ov-file
+SDEdit,SDEdit,919.3197119687734,917.7697171405332,MIT License,Stanford University,https://sde-image-editing.github.io
+Pix2PixZero,Pix2PixZero,852.1348318093445,851.8763845716259,MIT License,"Carnegie Mellon University, Adobe Research",https://pix2pixzero.github.io/
arena_elo/results/20241211/t2i_generation_leaderboard.csv
ADDED
@@ -0,0 +1,18 @@
+key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
+FLUX.1-dev,FLUX.1-dev,1121.4486601242502,1128.197250918009,flux-1-dev-non-commercial-license (other),Black Forest Labs,https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux
+PlayGround V2.5,PlayGround V2.5,1117.3965302826998,1117.608362537415,Playground v2.5 Community License,Playground,https://huggingface.co/playgroundai/playground-v2.5-1024px-aesthetic
+FLUX.1-schnell,FLUX.1-schnell,1091.9943954765995,1098.5515522727444,Apache-2.0,Black Forest Labs,https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux
+PlayGround V2,PlayGround V2,1073.4358779290842,1072.7546789340154,Playground v2 Community License,Playground,https://huggingface.co/playgroundai/playground-v2-1024px-aesthetic
+Kolors,Kolors,1051.6210331030522,1050.8529262891388,Apache-2.0,Kwai Kolors,https://huggingface.co/Kwai-Kolors/Kolors
+StableCascade,StableCascade,1041.4165535435802,1044.1606535376802,stable-cascade-nc-community (other),Stability AI,https://fal.ai/models/stable-cascade/api
+HunyuanDiT,HunyuanDiT,1022.9312260549116,1016.9352728465166,tencent-hunyuan-community,Tencent,https://github.com/Tencent/HunyuanDiT
+PixArtAlpha,PixArtAlpha,1020.1155281805594,1012.2359305243641,openrail++,PixArt-alpha,https://huggingface.co/PixArt-alpha/PixArt-XL-2-1024-MS
+PixArtSigma,PixArtSigma,1019.1321764438638,1019.3947143326678,openrail++,PixArt-alpha,https://github.com/PixArt-alpha/PixArt-sigma
+SDXL-Lightning,SDXL-Lightning,1018.9162837302384,1023.1896068510646,openrail++,ByteDance,https://huggingface.co/ByteDance/SDXL-Lightning
+SD3,SD3,1008.3923555357085,1011.2183103003182,stabilityai-nc-research-community,Stability AI,https://huggingface.co/blog/sd3
+AuraFlow,AuraFlow,997.728802839903,994.0520977829948,Apache-2.0,Fal.AI,https://huggingface.co/fal/AuraFlow
+SDXL,SDXL,968.6355564577761,969.2539822246374,openrail++,Stability AI,https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0
+SDXLTurbo,SDXLTurbo,915.5256581329278,913.7937287705809,sai-nc-community (other),Stability AI,https://huggingface.co/stabilityai/sdxl-turbo
+LCM(v1.5/XL),LCM(v1.5/XL),906.202164093364,900.2717108462446,openrail++,Latent Consistency,https://fal.ai/models/fast-lcm-diffusion-turbo
+OpenJourney,OpenJourney,833.4517296135969,828.4807395699017,creativeml-openrail-m,PromptHero,https://huggingface.co/prompthero/openjourney
+LCM,LCM,791.6554684578832,806.0704110447675,MIT License,Tsinghua University,https://huggingface.co/SimianLuo/LCM_Dreamshaper_v7
arena_elo/results/20241211/video_generation_leaderboard.csv
ADDED
@@ -0,0 +1,14 @@
+key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
+CogVideoX-5B,CogVideoX-5B,1149.0467990559753,1143.8489720202278,CogVideoX LICENSE,THUDM,https://github.com/THUDM/CogVideo
+Pyramid Flow,Pyramid Flow,1142.2012500842695,1144.6306640141165,MIT LICENSE,Peking University,https://pyramid-flow.github.io/
+StableVideoDiffusion,StableVideoDiffusion,1123.459174455383,1125.9300224510548,SVD-nc-community,Stability AI,https://fal.ai/models/fal-ai/fast-svd/text-to-video/api
+CogVideoX-2B,CogVideoX-2B,1068.3442892756807,1064.5113173896357,CogVideoX LICENSE,THUDM,https://github.com/THUDM/CogVideo
+T2V-Turbo,T2V-Turbo,1054.0619737597983,1054.052143493999,cc-by-nc-4.0,"University of California, Santa Barbara",https://github.com/Ji4chenLi/t2v-turbo
+AnimateDiff,AnimateDiff,1041.1138517372308,1039.9705875487114,creativeml-openrail-m,"The Chinese University of Hong Kong, Shanghai AI Lab, Stanford University",https://fal.ai/models/fast-animatediff-t2v
+VideoCrafter2,VideoCrafter2,1038.6609035771987,1039.442228627356,Apache 2.0,Tencent AI Lab,https://ailab-cvc.github.io/videocrafter2/
+Allegro,Allegro,1011.62764027565,1014.1555579516963,Apache 2.0,rhymes-ai,https://github.com/rhymes-ai/Allegro
+LaVie,LaVie,968.0112777795196,968.72992940921,Apache 2.0,Shanghai AI Lab,https://github.com/Vchitect/LaVie
+OpenSora,OpenSora,884.2192842497221,884.717192070073,Apache 2.0,HPC-AI Tech,https://github.com/hpcaitech/Open-Sora
+OpenSora v1.2,OpenSora v1.2,850.0337198093126,847.8764663237778,Apache 2.0,HPC-AI Tech,https://github.com/hpcaitech/Open-Sora
+AnimateDiff Turbo,AnimateDiff Turbo,835.1752634927545,836.0388913181373,creativeml-openrail-m,"The Chinese University of Hong Kong, Shanghai AI Lab, Stanford University",https://fal.ai/models/fast-animatediff-t2v-turbo
+ModelScope,ModelScope,834.0445724475059,836.0960273820054,cc-by-nc-4.0,Alibaba Group,https://arxiv.org/abs/2308.06571
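
Each leaderboard CSV added above shares the same header: key, Model, Arena Elo rating (anony), Arena Elo rating (full), License, Organization, Link. A minimal sketch of loading one snapshot and ranking models by the anonymous-battle rating (pandas assumed; the path is one of the files added in this commit):

    import pandas as pd

    # Any of the three dated leaderboard CSVs shares this schema.
    df = pd.read_csv("arena_elo/results/20241211/t2i_generation_leaderboard.csv")
    ranked = df.sort_values("Arena Elo rating (anony)", ascending=False)
    print(ranked[["Model", "Arena Elo rating (anony)", "Organization"]].head())
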
arena_elo/results/latest/clean_battle_image_editing.json
CHANGED
@@ -19704,5 +19704,197 @@
 "judge": "arena_user_10.16.38.196",
 "anony": true,
 "tstamp": 1733435624.5129
+},
+{
+"model_a_conv_id": "0b466565b7564844ad67a7747c870380",
+"model_b_conv_id": "483aa5c17885460e88fb055da225ba7b",
+"inputs": {
+"source_prompt": "A piece of pie has bananas and whipped cream surrounding it on a white plate.",
+"target_prompt": "A piece of pie with bananas, whipped cream, and strawberries surrounding it on a white plate.",
+"instruct_prompt": "put strawberry on the plate"
+},
+"model_a": "CosXLEdit",
+"model_b": "InfEdit",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.16.234",
+"anony": true,
+"tstamp": 1733850827.1895
+},
+{
+"model_a_conv_id": "6f10357fa6e24b07959552529542a741",
+"model_b_conv_id": "934b95231f2944279430bf17058ccc87",
+"inputs": {
+"source_prompt": "The couch and table were in the living room.",
+"target_prompt": "The couch and aquarium were in the living room.",
+"instruct_prompt": "remove the table and add an aquarium"
+},
+"model_a": "MagicBrush",
+"model_b": "Prompt2prompt",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1733850844.5397
+},
+{
+"model_a_conv_id": "b65af2335c5e4556bd1f3bb78aa2d104",
+"model_b_conv_id": "faeafe77b35348d7b927476f7c5972a3",
+"inputs": {
+"source_prompt": "A black and white picture of an intersection",
+"target_prompt": "A black and white picture of a policeman directing traffic at an intersection.",
+"instruct_prompt": "Put a policeman in the intersection."
+},
+"model_a": "CosXLEdit",
+"model_b": "Prompt2prompt",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.3.13",
+"anony": true,
+"tstamp": 1733850859.7442
+},
+{
+"model_a_conv_id": "eefa8333c9814ca19280f82f8cc5edcc",
+"model_b_conv_id": "f80047d49d4e4f5ab10f6818eafcca63",
+"inputs": {
+"source_prompt": "a dinner table set with many plates and spoons and forks",
+"target_prompt": "A gorilla joins the dinner party, surrounded by plates, spoons, and forks.",
+"instruct_prompt": "Have a gorilla sit at the dinner table."
+},
+"model_a": "Prompt2prompt",
+"model_b": "CosXLEdit",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1733850876.7719
+},
+{
+"model_a_conv_id": "57a28ea5b428481aa1224ab4d3240413",
+"model_b_conv_id": "752e0c30b3fe45f190f0302c91ef123b",
+"inputs": {
+"source_prompt": "a bear walks on a rocky surface",
+"target_prompt": "A bear and a robot tiger on a rocky surface",
+"instruct_prompt": "put a robot tiger next to the bear"
+},
+"model_a": "SDEdit",
+"model_b": "UltraEdit",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733930447.697
+},
+{
+"model_a_conv_id": "7cf1ca2173694917baaf88700fbc13c1",
+"model_b_conv_id": "553d947b4c0e4c4d85f23910b880141e",
+"inputs": {
+"source_prompt": "a large elephant that is standing up eating",
+"target_prompt": "A large elephant standing up and enjoying a watermelon",
+"instruct_prompt": "He should be eating a watermelon"
+},
+"model_a": "Pix2PixZero",
+"model_b": "UltraEdit",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.0.161",
+"anony": true,
+"tstamp": 1733930477.3696
+},
+{
+"model_a_conv_id": "07fdbdbb36994e27a24c1b4b584d7c43",
+"model_b_conv_id": "721763d5d0ce4b7097549a199f1a2252",
+"inputs": {
+"source_prompt": "A man tries to stand on his surf board in the water.",
+"target_prompt": "A man tries to stand on his surf board as a dolphin jumps out of the water.",
+"instruct_prompt": "Have there be a dolphin jumping out of the water"
+},
+"model_a": "SDEdit",
+"model_b": "InfEdit",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733930499.4816
+},
+{
+"model_a_conv_id": "e3f74b303aa64c92bad350448db8ed10",
+"model_b_conv_id": "6dc510043f254181af66ef3177ee3db1",
+"inputs": {
+"source_prompt": "A small black dog with a newspaper in its mouth.",
+"target_prompt": "A small black dog with a newspaper in its mouth next to a potted plant.",
+"instruct_prompt": "let there be potted plant"
+},
+"model_a": "UltraEdit",
+"model_b": "CycleDiffusion",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733930518.3777
+},
+{
+"model_a_conv_id": "9f083ce8860a4aca8adc5f394244ad04",
+"model_b_conv_id": "0d90f318b5f14953a7db09ee5c62d707",
+"inputs": {
+"source_prompt": "A small black dog playing with a soccer ball.",
+"target_prompt": "A small black dog with a newspaper in its mouth.",
+"instruct_prompt": "let the dog have a newspaper in its mouth"
+},
+"model_a": "Prompt2prompt",
+"model_b": "Pix2PixZero",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733930541.0286
+},
+{
+"model_a_conv_id": "d473cbddc42341d3ade0e4fa63ecf32d",
+"model_b_conv_id": "836177aa444849fd817dc852c46e5250",
+"inputs": {
+"source_prompt": "This truck is loaded with two freezers and tied down securely.",
+"target_prompt": "This truck with open door is loaded with two freezers and tied down securely.",
+"instruct_prompt": "open the door of the truck"
+},
+"model_a": "UltraEdit",
+"model_b": "Pix2PixZero",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.16.234",
+"anony": true,
+"tstamp": 1733930576.7066
+},
+{
+"model_a_conv_id": "d602b36573834089ad06c29134149178",
+"model_b_conv_id": "f28150b3101a4bcca57589c3a0f5fada",
+"inputs": {
+"source_prompt": "there are many people riding this train together",
+"target_prompt": "There are many angry people riding this train together",
+"instruct_prompt": "make the people angry"
+},
+"model_a": "Prompt2prompt",
+"model_b": "SDEdit",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.0.161",
+"anony": true,
+"tstamp": 1733930611.6987
+},
+{
+"model_a_conv_id": "8494f6ecf56046778b28cbe67a1c5c70",
+"model_b_conv_id": "d545e47078664c93b5751c7238a405f7",
+"inputs": {
+"source_prompt": "A man scuba diving down a snow covered conifer hillside.",
+"target_prompt": "A man scuba diving with a polar bear down a snow covered conifer hillside.",
+"instruct_prompt": "add a polar bear"
+},
+"model_a": "MagicBrush",
+"model_b": "Pix2PixZero",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.16.234",
+"anony": true,
+"tstamp": 1733930669.9543
 }
 ]
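
The clean_battle_*.json files updated in this commit are flat JSON arrays of battle records; each record carries the two model names, the vote_type, and a winner of "model_a", "model_b", "tie", or "tie (bothbad)". A minimal sketch of tallying per-model wins from one of these files (illustrative counting only, not the arena's own aggregation code):

    import json
    from collections import Counter

    with open("arena_elo/results/latest/clean_battle_image_editing.json") as f:
        battles = json.load(f)

    wins = Counter()
    for b in battles:
        if b["winner"] == "model_a":
            wins[b["model_a"]] += 1
        elif b["winner"] == "model_b":
            wins[b["model_b"]] += 1
        # "tie" and "tie (bothbad)" records add no wins in this simple tally

    print(wins.most_common(5))
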
arena_elo/results/latest/clean_battle_t2i_generation.json
CHANGED
@@ -111028,5 +111028,397 @@
 "judge": "arena_user_10.20.38.195",
 "anony": true,
 "tstamp": 1733832205.6976
+},
+{
+"model_a_conv_id": "ff055b5cf4e44d35be512582dc908962",
+"model_b_conv_id": "ba2816b514534318b58110124488bb2d",
+"inputs": {
+"prompt": "hyperrealism render of a surreal alien humanoid"
+},
+"model_a": "Kolors",
+"model_b": "SDXL",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.16.234",
+"anony": true,
+"tstamp": 1733850560.6897
+},
+{
+"model_a_conv_id": "34667869768b401da2ca154e2656962e",
+"model_b_conv_id": "91b7f06cc8b04a5f9c5a0122d8e5ea08",
+"inputs": {
+"prompt": "Abraham Lincoln touches his toes while George Washington does chin-ups. Lincoln is barefoot. Washington is wearing boots."
+},
+"model_a": "SDXL-Lightning",
+"model_b": "OpenJourney",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1733850568.1557
+},
+{
+"model_a_conv_id": "7544c242cbb8493f858642c6671f3ca7",
+"model_b_conv_id": "055382d7d1334902b0329c58920e0f0c",
+"inputs": {
+"prompt": "minecraft cosmic fantasy wallpaper synthwave mix title card, party rpg, 4 k "
+},
+"model_a": "StableCascade",
+"model_b": "SDXL",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1733850580.4251
+},
+{
+"model_a_conv_id": "63ef5258fc1942f6988a6c964ae4837d",
+"model_b_conv_id": "76be29c8ad0744edb88ad6bd77b60526",
+"inputs": {
+"prompt": "Rbefraigerator."
+},
+"model_a": "OpenJourney",
+"model_b": "PixArtSigma",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1733850601.6417
+},
+{
+"model_a_conv_id": "533cc5e0255f4501a1bea983c4adaf1a",
+"model_b_conv_id": "6f2a39aa7e73486cabf4ad85cc75fc48",
+"inputs": {
+"prompt": "photorealistic, detailed mechanical motorcycle, detailed"
+},
+"model_a": "FLUX.1-schnell",
+"model_b": "OpenJourney",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.16.234",
+"anony": true,
+"tstamp": 1733850608.2495
+},
+{
+"model_a_conv_id": "71020e6c7e8a4c2f80b66b52729e2e61",
+"model_b_conv_id": "72de9bae58724c638825907c924ac401",
+"inputs": {
+"prompt": "complex 3 d render hyper detailed ultra sharp aesthetic house on fire, medium portrait, close - up, bright 3 point light "
+},
+"model_a": "SDXL",
+"model_b": "SD3",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1733850613.145
+},
+{
+"model_a_conv_id": "a45ccb098414433d8149064459592a11",
+"model_b_conv_id": "44d6ec08afd241bea8722064188e0aa6",
+"inputs": {
+"prompt": "One cat and two dogs sitting on the grass."
+},
+"model_a": "OpenJourney",
+"model_b": "SDXL-Lightning",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733850620.4167
+},
+{
+"model_a_conv_id": "16fb779122e6440f877e33ddb8b49c79",
+"model_b_conv_id": "cf1511f738b749dd88deadc5a80f61d4",
+"inputs": {
+"prompt": "A baseball player in a blue and white uniform is next to a player in black and white ."
+},
+"model_a": "OpenJourney",
+"model_b": "SD3",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733850624.8608
+},
+{
+"model_a_conv_id": "bed29a7fe6d3484a85f14ec3b1681bda",
+"model_b_conv_id": "d8633b0f0f8246ddaca54561b1024387",
+"inputs": {
+"prompt": "Three cats and one dog sitting on the grass."
+},
+"model_a": "Kolors",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1733850631.2754
+},
+{
+"model_a_conv_id": "ff9846379c094e81af30249c5193aa94",
+"model_b_conv_id": "86dba607d80b4e2a95b688efe007f29d",
+"inputs": {
+"prompt": "Artophagous."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "StableCascade",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.3.13",
+"anony": true,
+"tstamp": 1733850636.8782
+},
+{
+"model_a_conv_id": "5de16ac112ba43d189a6694b95cc3c58",
+"model_b_conv_id": "63985f1097f548429f3a5c2ddfac1359",
+"inputs": {
+"prompt": "complex 3 d render hyper detailed ultra sharp aesthetic house on fire, medium portrait, close - up, bright 3 point light "
+},
+"model_a": "SD3",
+"model_b": "AuraFlow",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.16.234",
+"anony": true,
+"tstamp": 1733850641.982
+},
+{
+"model_a_conv_id": "c8e1ed4a383f45e8a4ab7d4e4f7892ed",
+"model_b_conv_id": "71d6d9addaac4ec9982eb6d76ca8a45d",
+"inputs": {
+"prompt": "A large red building with a clock in it is surrounded by palm trees and white flags ."
+},
+"model_a": "PixArtAlpha",
+"model_b": "AuraFlow",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733850646.6476
+},
+{
+"model_a_conv_id": "a12dd11c4e344cc681ca32c5c93d22fd",
+"model_b_conv_id": "2e43eeb457024f8b965948fead54350e",
+"inputs": {
+"prompt": "hyperrealism, epic photography, closeup, 35mm film, photography, of young girl, in city"
+},
+"model_a": "SDXLTurbo",
+"model_b": "HunyuanDiT",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1733850651.4493
+},
+{
+"model_a_conv_id": "bfb0b5a2242545f780d25677ca6b0797",
+"model_b_conv_id": "19e9b2f9a6d94efeba756f1c2c9539eb",
+"inputs": {
+"prompt": "Supreme Court Justices play a baseball game with the FBI. The FBI is at bat, the justices are on the field."
+},
+"model_a": "SD3",
+"model_b": "Kolors",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.16.234",
+"anony": true,
+"tstamp": 1733850658.3228
+},
+{
+"model_a_conv_id": "b3f054ecc3b7438eb88665373f65905d",
+"model_b_conv_id": "23f1e6cdd00a45528f85f3221d333665",
+"inputs": {
+"prompt": "A pink scooter with a black seat next to a blue car."
+},
+"model_a": "SDXLTurbo",
+"model_b": "SDXL-Lightning",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733850667.3173
+},
+{
+"model_a_conv_id": "eda1196bcf71454ba5e9321aafed9c1f",
+"model_b_conv_id": "91af09adc74d438a810276c1ae973575",
+"inputs": {
+"prompt": "Darth Vader playing with raccoon in Mars during sunset."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PixArtSigma",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733850674.7327
+},
+{
+"model_a_conv_id": "6b7b890fa8214318b5165c773451b683",
+"model_b_conv_id": "6be8f4cbd24544a5aacea1245f8139df",
+"inputs": {
+"prompt": "half life 3 "
+},
+"model_a": "FLUX.1-schnell",
+"model_b": "SDXLTurbo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733850680.796
+},
+{
+"model_a_conv_id": "270bad1e1a7b4907a1195cd499431fa4",
+"model_b_conv_id": "d9385c6d1ee8487eb06eeb6ba7d767a1",
+"inputs": {
+"prompt": "half life 3 "
+},
+"model_a": "SDXL",
+"model_b": "AuraFlow",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.3.13",
+"anony": true,
+"tstamp": 1733850690.3954
+},
+{
+"model_a_conv_id": "1d7bcc7aa6c1441497b395a22dd77546",
+"model_b_conv_id": "f8208616aeb041e194a03e53c9224b3c",
+"inputs": {
+"prompt": "Half Life 3"
+},
+"model_a": "PixArtAlpha",
+"model_b": "AuraFlow",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733850814.1133
+},
+{
+"model_a_conv_id": "1476a7eff06e4ed98bd8cbc85cec42c4",
+"model_b_conv_id": "26a40e419e31432c991c93d4fef5446c",
+"inputs": {
+"prompt": "checkerboard "
+},
+"model_a": "OpenJourney",
+"model_b": "PixArtSigma",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733860574.4419
+},
+{
+"model_a_conv_id": "d4be8859da164539bd57be4a8a386fe8",
+"model_b_conv_id": "0df51cc70a03475a8064e8ed0f53bd85",
+"inputs": {
+"prompt": "A realistic photo of a Pomeranian dressed up like a 1980s professional wrestler with neon green and neon orange face paint and bright green wrestling tights with bright orange boots."
+},
+"model_a": "StableCascade",
+"model_b": "FLUX.1-schnell",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.20.38.195",
+"anony": true,
+"tstamp": 1733886164.0257
+},
+{
+"model_a_conv_id": "78242c912de249a385071b52f5ea29b7",
+"model_b_conv_id": "533bd7d5f98842e8a3794ffab3f8c70e",
+"inputs": {
+"prompt": "emperor palpatine in the desert of tatooine, film still, wide shot, heat, desert, sci fi, epic, dramatic light "
+},
+"model_a": "SDXL-Lightning",
+"model_b": "AuraFlow",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.20.26.107",
+"anony": true,
+"tstamp": 1733886180.662
+},
+{
+"model_a_conv_id": "1c05471dae6047109a63da2e6c9b669d",
+"model_b_conv_id": "838c56ac2efc4a7c811f1e6f317ebc18",
+"inputs": {
+"prompt": "A clean bathroom with pink walls and white appliances."
+},
+"model_a": "SDXL",
+"model_b": "SD3",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.20.38.195",
+"anony": true,
+"tstamp": 1733886205.2213
+},
+{
+"model_a_conv_id": "1cae974e31e84b63838e4172425fad84",
+"model_b_conv_id": "553bf52497254e95ba5bcea3e7c9c511",
+"inputs": {
+"prompt": "Big Man Big muscles naked in pool Big pénis"
+},
+"model_a": "SD3",
+"model_b": "PlayGround V2.5",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.20.38.195",
+"anony": true,
+"tstamp": 1733906181.8302
+},
+{
+"model_a_conv_id": "9a029080a74942b2b23b307a796e3334",
+"model_b_conv_id": "cc403f74da5443eeadf3837c79e9f7d4",
+"inputs": {
+"prompt": "a cute dog is playing a ball"
+},
+"model_a": "FLUX.1-schnell",
+"model_b": "SDXL",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.39.72",
+"anony": true,
+"tstamp": 1733921488.5334
+},
+{
+"model_a_conv_id": "babe01c5a02f49b48bc4274c666b5e1a",
+"model_b_conv_id": "55f878ff2e404e5e82a49cd3b5501a00",
+"inputs": {
+"prompt": "photorealistic gemstone skull"
+},
+"model_a": "AuraFlow",
+"model_b": "FLUX.1-dev",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.16.234",
+"anony": false,
+"tstamp": 1733930334.0214
+},
+{
+"model_a_conv_id": "d02a24ee211a4b59b724e0f8e3ea0b46",
+"model_b_conv_id": "a08dc2717af74048a15dea38aa2fd48a",
+"inputs": {
+"prompt": "Supreme Court Justices play a baseball game with the FBI. The FBI is at bat, the justices are on the field."
+},
+"model_a": "AuraFlow",
+"model_b": "FLUX.1-dev",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.19.14",
+"anony": false,
+"tstamp": 1733930365.02
+},
+{
+"model_a_conv_id": "dd977b4df7014267a46611534f3e7d06",
+"model_b_conv_id": "5d633fe106b44049add8782ca3a1189c",
+"inputs": {
+"prompt": "A storefront with 'Diffusion' written on it."
+},
+"model_a": "AuraFlow",
+"model_b": "FLUX.1-dev",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.19.14",
+"anony": false,
+"tstamp": 1733930378.7836
 }
 ]
arena_elo/results/latest/clean_battle_video_generation.json
CHANGED
@@ -34074,5 +34074,201 @@
 "judge": "arena_user_10.20.38.195",
 "anony": true,
 "tstamp": 1733782200.9877
+},
+{
+"model_a_conv_id": "372ed16c610d40f6ad84b2df5dfdfe3e",
+"model_b_conv_id": "a61b7eb22940438a8bf1dbc87291ad89",
+"inputs": {
+"prompt": "a potted plant and a tv"
+},
+"model_a": "OpenSora v1.2",
+"model_b": "T2V-Turbo",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.38.196",
+"anony": false,
+"tstamp": 1733860641.9496
+},
+{
+"model_a_conv_id": "8e442be7dd3745edb530a2443a28a200",
+"model_b_conv_id": "8061ac8337814d3696c2bc81812600de",
+"inputs": {
+"prompt": "a bowl and a remote"
+},
+"model_a": "AnimateDiff",
+"model_b": "AnimateDiff Turbo",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.20.26.107",
+"anony": true,
+"tstamp": 1733903994.1779
+},
+{
+"model_a_conv_id": "2a34184a33d94fdfb9fcc2925c7aa27a",
+"model_b_conv_id": "20a770572470473aabaea58320388006",
+"inputs": {
+"prompt": "a tie"
+},
+"model_a": "LTXVideo",
+"model_b": "StableVideoDiffusion",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.20.38.195",
+"anony": true,
+"tstamp": 1733904014.639
+},
+{
+"model_a_conv_id": "d657a44c112144448281a9a94837082f",
+"model_b_conv_id": "2a59035a5ed74984af3d3db90829a640",
+"inputs": {
+"prompt": "A beautiful coastal beach in spring, waves lapping on sand by Hokusai, in the style of Ukiyo"
+},
+"model_a": "Mochi1",
+"model_b": "CogVideoX-5B",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.20.26.107",
+"anony": true,
+"tstamp": 1733904023.9531
+},
+{
+"model_a_conv_id": "92ab9c357ff84c7a9b969afe5f7f1bf6",
+"model_b_conv_id": "6fc28301299d4415988eef6c5b470714",
+"inputs": {
+"prompt": "a boat slowing down to stop"
+},
+"model_a": "CogVideoX-5B",
+"model_b": "AnimateDiff Turbo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.20.38.195",
+"anony": true,
+"tstamp": 1733904076.5845
+},
+{
+"model_a_conv_id": "57817b1a87594bf485ff315ba4dfdbe2",
+"model_b_conv_id": "8d08f2450caa4ab686ced96991c24f8d",
+"inputs": {
+"prompt": "A person is push up"
+},
+"model_a": "T2V-Turbo",
+"model_b": "VideoCrafter2",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.20.26.107",
+"anony": true,
+"tstamp": 1733904092.1556
+},
+{
+"model_a_conv_id": "bf7505466a8042f7a0de5dcda2583d96",
+"model_b_conv_id": "95edf55854854205ae062885e971a172",
+"inputs": {
+"prompt": "Clown fish swimming through the coral reef"
+},
+"model_a": "LTXVideo",
+"model_b": "AnimateDiff Turbo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.20.38.195",
+"anony": true,
+"tstamp": 1733904102.2688
+},
+{
+"model_a_conv_id": "d779985634ae4bb191e0478557bc026d",
+"model_b_conv_id": "552579c2f8234e548a3f7d27e7355b35",
+"inputs": {
+"prompt": "A person is baby waking up"
+},
+"model_a": "Mochi1",
+"model_b": "LTXVideo",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.12.226",
+"anony": false,
+"tstamp": 1733931931.3219
+},
+{
+"model_a_conv_id": "7c64b99f95724968bb6f0d888f50aa10",
+"model_b_conv_id": "aa484025c75945a69a349e8701ef8072",
+"inputs": {
+"prompt": "a pink suitcase"
+},
+"model_a": "Mochi1",
+"model_b": "LTXVideo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.0.161",
+"anony": false,
+"tstamp": 1733931942.2743
+},
+{
+"model_a_conv_id": "c657b824e56945e28fdfcb292691ee2f",
+"model_b_conv_id": "1edf534275564191b97a68247478654c",
+"inputs": {
+"prompt": "A shark swimming in clear Caribbean ocean"
+},
+"model_a": "Mochi1",
+"model_b": "LTXVideo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.16.234",
+"anony": false,
+"tstamp": 1733931952.837
+},
+{
+"model_a_conv_id": "15a4b72864904bc484df3d5b7714e08a",
+"model_b_conv_id": "c18dc07403294f03b3a48a9d8b3131bc",
+"inputs": {
+"prompt": "A person is cutting watermelon"
+},
+"model_a": "Mochi1",
+"model_b": "LTXVideo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.39.72",
+"anony": false,
+"tstamp": 1733931963.1188
+},
+{
+"model_a_conv_id": "e097a74710f64030a59527b14b410fa7",
+"model_b_conv_id": "b2857709e22446899e4671c9a1d8ccbb",
+"inputs": {
+"prompt": "broccoli on the bottom of a banana, front view"
+},
+"model_a": "Mochi1",
+"model_b": "LTXVideo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.19.14",
+"anony": false,
+"tstamp": 1733931973.3066
+},
+{
+"model_a_conv_id": "baed0a3b2cbd480d8d264af8645dc978",
+"model_b_conv_id": "d18c924ca9044c7285b5ad5a07ba8aa7",
+"inputs": {
+"prompt": "A person is washing dishes"
+},
+"model_a": "Mochi1",
+"model_b": "LTXVideo",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.12.226",
+"anony": false,
+"tstamp": 1733931981.8765
+},
+{
+"model_a_conv_id": "4deaf51a08e84a64b9ecd5dc312dca7c",
+"model_b_conv_id": "13e3c1018aa04a16b59b42d4efae3ce6",
+"inputs": {
+"prompt": "a surfboard on the bottom of skis, front view"
+},
+"model_a": "Mochi1",
+"model_b": "LTXVideo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.39.72",
+"anony": false,
+"tstamp": 1733931992.3168
 }
 ]
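
The "Arena Elo rating" columns in the leaderboard files below are recomputed from battle records like the ones above. This commit does not include the rating code itself, so the following is only a rough sketch of a standard online Elo update over such records; the K-factor, initial rating, and tie handling are assumptions, and the arena's actual pipeline may instead fit ratings with a bootstrapped maximum-likelihood procedure:

    from collections import defaultdict

    K = 32  # assumed K-factor
    ratings = defaultdict(lambda: 1000.0)  # assumed initial rating

    def expected_score(r_a: float, r_b: float) -> float:
        """Expected score of model A against model B under the Elo model."""
        return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

    def update(battle: dict) -> None:
        """Apply one battle record (model_a, model_b, winner) to the ratings."""
        a, b = battle["model_a"], battle["model_b"]
        if battle["winner"] == "model_a":
            s_a = 1.0
        elif battle["winner"] == "model_b":
            s_a = 0.0
        else:  # "tie" or "tie (bothbad)"
            s_a = 0.5
        e_a = expected_score(ratings[a], ratings[b])
        ratings[a] += K * (s_a - e_a)
        ratings[b] += K * ((1.0 - s_a) - (1.0 - e_a))
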
arena_elo/results/latest/elo_results_image_editing.pkl
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:fa959671e9c719a0a52106fd45cb9e3bbbb5dc957b919b27f78e554d4e586354
+size 66042
arena_elo/results/latest/elo_results_t2i_generation.pkl
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:6cda80aaece5c0f36b1e8972a6aa82c8c0212ce5682d09157826283573348ae5
+size 88249
arena_elo/results/latest/elo_results_video_generation.pkl
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:cdba33bfac60c9a15179bcbd0c514784f74631dae0f8029b6396dd8dc10030ea
 size 75108
arena_elo/results/latest/image_editing_leaderboard.csv
CHANGED
@@ -1,10 +1,11 @@
 key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
-MagicBrush,MagicBrush,
-InfEdit,InfEdit,
-
-
-
-
-
-
-
+MagicBrush,MagicBrush,1099.0867788929968,1103.1120557156723,CC-BY-4.0,"The Ohio State University, University of Waterloo",https://osu-nlp-group.github.io/MagicBrush/
+InfEdit,InfEdit,1064.570347037348,1064.3480021089454,CC BY-NC-ND 4.0,"University of Michigan, University of California, Berkeley",https://sled-group.github.io/InfEdit/
+UltraEdit,UltraEdit,1060.1430521278826,1060.1858880341144,other,Peking University; BIGAI,https://ultra-editing.github.io/
+CosXLEdit,CosXLEdit,1059.3947130775634,1060.33362522597,cosxl-nc-community,Stability AI,https://huggingface.co/stabilityai/cosxl
+InstructPix2Pix,InstructPix2Pix,1033.0249954054239,1030.7488651531455,"Copyright 2023 Timothy Brooks, Aleksander Holynski, Alexei A. Efros","University of California, Berkeley",https://www.timothybrooks.com/instruct-pix2pix
+PNP,PNP,992.2660415414323,996.7618215250773,-,Weizmann Institute of Science,https://github.com/MichalGeyer/plug-and-play
+Prompt2prompt,Prompt2prompt,984.5203259478594,985.5845402588227,Apache-2.0,"Google, Tel Aviv University",https://prompt-to-prompt.github.io/
+CycleDiffusion,CycleDiffusion,935.5392021913754,929.2791002660932,X11,Carnegie Mellon University,https://github.com/ChenWu98/cycle-diffusion?tab=readme-ov-file
+SDEdit,SDEdit,919.3197119687734,917.7697171405332,MIT License,Stanford University,https://sde-image-editing.github.io
+Pix2PixZero,Pix2PixZero,852.1348318093445,851.8763845716259,MIT License,"Carnegie Mellon University, Adobe Research",https://pix2pixzero.github.io/
arena_elo/results/latest/t2i_generation_leaderboard.csv
CHANGED
@@ -1,18 +1,18 @@
 key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
-FLUX.1-dev,FLUX.1-dev,
-PlayGround V2.5,PlayGround V2.5,1117.
-FLUX.1-schnell,FLUX.1-schnell,
-PlayGround V2,PlayGround V2,
-Kolors,Kolors,
-StableCascade,StableCascade,
-HunyuanDiT,HunyuanDiT,1022.
-PixArtAlpha,PixArtAlpha,
-PixArtSigma,PixArtSigma,1019.
-SDXL-Lightning,SDXL-Lightning,
-SD3,SD3,1008.
-AuraFlow,AuraFlow,
-SDXL,SDXL,
-SDXLTurbo,SDXLTurbo,915.
-LCM(v1.5/XL),LCM(v1.5/XL),
-OpenJourney,OpenJourney,833.
-LCM,LCM,791.
+FLUX.1-dev,FLUX.1-dev,1121.4486601242502,1128.197250918009,flux-1-dev-non-commercial-license (other),Black Forest Labs,https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux
+PlayGround V2.5,PlayGround V2.5,1117.3965302826998,1117.608362537415,Playground v2.5 Community License,Playground,https://huggingface.co/playgroundai/playground-v2.5-1024px-aesthetic
+FLUX.1-schnell,FLUX.1-schnell,1091.9943954765995,1098.5515522727444,Apache-2.0,Black Forest Labs,https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux
+PlayGround V2,PlayGround V2,1073.4358779290842,1072.7546789340154,Playground v2 Community License,Playground,https://huggingface.co/playgroundai/playground-v2-1024px-aesthetic
+Kolors,Kolors,1051.6210331030522,1050.8529262891388,Apache-2.0,Kwai Kolors,https://huggingface.co/Kwai-Kolors/Kolors
+StableCascade,StableCascade,1041.4165535435802,1044.1606535376802,stable-cascade-nc-community (other),Stability AI,https://fal.ai/models/stable-cascade/api
+HunyuanDiT,HunyuanDiT,1022.9312260549116,1016.9352728465166,tencent-hunyuan-community,Tencent,https://github.com/Tencent/HunyuanDiT
+PixArtAlpha,PixArtAlpha,1020.1155281805594,1012.2359305243641,openrail++,PixArt-alpha,https://huggingface.co/PixArt-alpha/PixArt-XL-2-1024-MS
+PixArtSigma,PixArtSigma,1019.1321764438638,1019.3947143326678,openrail++,PixArt-alpha,https://github.com/PixArt-alpha/PixArt-sigma
+SDXL-Lightning,SDXL-Lightning,1018.9162837302384,1023.1896068510646,openrail++,ByteDance,https://huggingface.co/ByteDance/SDXL-Lightning
+SD3,SD3,1008.3923555357085,1011.2183103003182,stabilityai-nc-research-community,Stability AI,https://huggingface.co/blog/sd3
+AuraFlow,AuraFlow,997.728802839903,994.0520977829948,Apache-2.0,Fal.AI,https://huggingface.co/fal/AuraFlow
+SDXL,SDXL,968.6355564577761,969.2539822246374,openrail++,Stability AI,https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0
+SDXLTurbo,SDXLTurbo,915.5256581329278,913.7937287705809,sai-nc-community (other),Stability AI,https://huggingface.co/stabilityai/sdxl-turbo
+LCM(v1.5/XL),LCM(v1.5/XL),906.202164093364,900.2717108462446,openrail++,Latent Consistency,https://fal.ai/models/fast-lcm-diffusion-turbo
+OpenJourney,OpenJourney,833.4517296135969,828.4807395699017,creativeml-openrail-m,PromptHero,https://huggingface.co/prompthero/openjourney
+LCM,LCM,791.6554684578832,806.0704110447675,MIT License,Tsinghua University,https://huggingface.co/SimianLuo/LCM_Dreamshaper_v7
arena_elo/results/latest/video_generation_leaderboard.csv
CHANGED
@@ -1,14 +1,14 @@
 key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
-CogVideoX-5B,CogVideoX-5B,
-Pyramid Flow,Pyramid Flow,1142.
-StableVideoDiffusion,StableVideoDiffusion,1123.
-CogVideoX-2B,CogVideoX-2B,1068.
-T2V-Turbo,T2V-Turbo,1054.
-AnimateDiff,AnimateDiff,1041.
-VideoCrafter2,VideoCrafter2,1038.
-Allegro,Allegro,1011.
-LaVie,LaVie,968.
-OpenSora,OpenSora,884.
-OpenSora v1.2,OpenSora v1.2,850.
-AnimateDiff Turbo,AnimateDiff Turbo,835.
-ModelScope,ModelScope,834.
+CogVideoX-5B,CogVideoX-5B,1149.0467990559753,1143.8489720202278,CogVideoX LICENSE,THUDM,https://github.com/THUDM/CogVideo
+Pyramid Flow,Pyramid Flow,1142.2012500842695,1144.6306640141165,MIT LICENSE,Peking University,https://pyramid-flow.github.io/
+StableVideoDiffusion,StableVideoDiffusion,1123.459174455383,1125.9300224510548,SVD-nc-community,Stability AI,https://fal.ai/models/fal-ai/fast-svd/text-to-video/api
+CogVideoX-2B,CogVideoX-2B,1068.3442892756807,1064.5113173896357,CogVideoX LICENSE,THUDM,https://github.com/THUDM/CogVideo
+T2V-Turbo,T2V-Turbo,1054.0619737597983,1054.052143493999,cc-by-nc-4.0,"University of California, Santa Barbara",https://github.com/Ji4chenLi/t2v-turbo
+AnimateDiff,AnimateDiff,1041.1138517372308,1039.9705875487114,creativeml-openrail-m,"The Chinese University of Hong Kong, Shanghai AI Lab, Stanford University",https://fal.ai/models/fast-animatediff-t2v
+VideoCrafter2,VideoCrafter2,1038.6609035771987,1039.442228627356,Apache 2.0,Tencent AI Lab,https://ailab-cvc.github.io/videocrafter2/
+Allegro,Allegro,1011.62764027565,1014.1555579516963,Apache 2.0,rhymes-ai,https://github.com/rhymes-ai/Allegro
+LaVie,LaVie,968.0112777795196,968.72992940921,Apache 2.0,Shanghai AI Lab,https://github.com/Vchitect/LaVie
+OpenSora,OpenSora,884.2192842497221,884.717192070073,Apache 2.0,HPC-AI Tech,https://github.com/hpcaitech/Open-Sora
+OpenSora v1.2,OpenSora v1.2,850.0337198093126,847.8764663237778,Apache 2.0,HPC-AI Tech,https://github.com/hpcaitech/Open-Sora
+AnimateDiff Turbo,AnimateDiff Turbo,835.1752634927545,836.0388913181373,creativeml-openrail-m,"The Chinese University of Hong Kong, Shanghai AI Lab, Stanford University",https://fal.ai/models/fast-animatediff-t2v-turbo
+ModelScope,ModelScope,834.0445724475059,836.0960273820054,cc-by-nc-4.0,Alibaba Group,https://arxiv.org/abs/2308.06571