Update README.md
README.md CHANGED
@@ -72,6 +72,27 @@ for item in ds["test"]:
|
    print([prompt, img]) # Replace with your processing logic
```

+## Leaderboard
+
+| Model                    | Overall   | Place Info | Nearby    | Routing   | Counting  | Unanswerable |
+|--------------------------|:---------:|:----------:|:---------:|:---------:|:---------:|:------------:|
+| Claude-3.5-Sonnet        | **61.65** | **82.64**  | 55.56     | **45.00** | **47.73** | **90.00**    |
+| GPT-4o                   | 58.90     | 76.86      | **57.78** | 50.00     | **47.73** | 40.00        |
+| Gemini-1.5-Pro           | 56.14     | 76.86      | 56.67     | 43.75     | 32.95     | 80.00        |
+| GPT-4-Turbo              | 55.89     | 75.21      | 56.67     | 42.50     | 44.32     | 40.00        |
+| Gemini-1.5-Flash         | 51.94     | 70.25      | 56.47     | 38.36     | 32.95     | 55.00        |
+| GPT-4o-mini              | 50.13     | 77.69      | 47.78     | 41.25     | 28.41     | 25.00        |
+| Qwen2-VL-7B-Instruct     | 51.63     | 71.07      | 48.89     | 40.00     | 40.91     | 40.00        |
+| Glm-4v-9b                | 48.12     | 73.55      | 42.22     | 41.25     | 34.09     | 10.00        |
+| InternLm-Xcomposer2      | 43.11     | 70.41      | 48.89     | 43.75     | 34.09     | 10.00        |
+| MiniCPM-Llama3-V-2.5     | 40.60     | 60.33      | 32.22     | 32.50     | 31.82     | 30.00        |
+| Llama-3-VILA1.5-8B       | 32.99     | 46.90      | 32.22     | 28.75     | 26.14     | 5.00         |
+| DocOwl1.5                | 31.08     | 43.80      | 23.33     | 32.50     | 27.27     | 0.00         |
+| Llava-v1.6-Mistral-7B-hf | 31.33     | 42.15      | 28.89     | 32.50     | 21.59     | 15.00        |
+| Paligemma-3B-mix-224     | 30.58     | 37.19      | 25.56     | 38.75     | 23.86     | 10.00        |
+| Llava-1.5-7B-hf          | 20.05     | 22.31      | 18.89     | 13.75     | 28.41     | 0.00         |
+| Human                    | 82.23     | 81.67      | 82.42     | 85.18     | 78.41     | 65.00        |
+
## Citation

If you use this dataset, please cite the original paper:
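The context lines in the hunk above (`for item in ds["test"]:` … `print([prompt, img])`) come from the README's usage snippet. A minimal, self-contained version of that loop might look like the sketch below; the repository id and the `question`/`image` field names are assumptions for illustration, not taken from this diff.

```python
from datasets import load_dataset

# Load the benchmark from the Hugging Face Hub.
# NOTE: "org/dataset-name" is a placeholder repo id, not the real one.
ds = load_dataset("org/dataset-name")

for item in ds["test"]:
    # Field names below are assumed; inspect ds["test"].features for the real schema.
    prompt = item.get("question")
    img = item.get("image")
    print([prompt, img])  # Replace with your processing logic
```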