Update README.md
README.md (CHANGED)
@@ -43,7 +43,7 @@ We provide Ruri-v3 in several model sizes. Below is a summary of each model.
 You can use our models directly with the transformers library v4.48.0 or higher:
 
 ```bash
-pip install -U "transformers>=4.48.0"
+pip install -U "transformers>=4.48.0" sentence-transformers
 ```
 
 Additionally, if your GPUs support Flash Attention 2, we recommend using our models with Flash Attention 2.

@@ -100,6 +100,7 @@ Evaluated with [JMTEB](https://github.com/sbintuitions/JMTEB).
 |[**Ruri-v3-310m**](https://huggingface.co/cl-nagoya/ruri-v3-310m)<br/>(this model)|**315M**|**77.24**|81.89|81.22|78.66|93.43|55.69|62.60|
 ||||||||||
 |[sbintuitions/sarashina-embedding-v1-1b](https://huggingface.co/sbintuitions/sarashina-embedding-v1-1b)|1.22B|75.50|77.61|82.71|78.37|93.74|53.86|62.00|
+|[PLaMo-Embedding-1B](https://huggingface.co/pfnet/plamo-embedding-1b)|1.05B|76.10|79.94|83.14|77.20|93.57|53.47|62.37|
 ||||||||||
 |OpenAI/text-embedding-ada-002|-|69.48|64.38|79.02|69.75|93.04|48.30|62.40|
 |OpenAI/text-embedding-3-small|-|70.86|66.39|79.46|73.06|92.92|51.06|62.27|
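The updated install line pulls in `sentence-transformers` alongside `transformers`. A minimal usage sketch under that setup, using the model id from the table above (the example sentences, the cosine-similarity helper, and the try/except guard are illustrative assumptions, not part of this change; consult the model card for the exact query/document prefixes Ruri expects):

```python
import math


def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors (plain iterables)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


if __name__ == "__main__":
    # Downloads the ~315M-parameter checkpoint on first run; guarded so the
    # sketch degrades gracefully if the package or network is unavailable.
    try:
        from sentence_transformers import SentenceTransformer

        model = SentenceTransformer("cl-nagoya/ruri-v3-310m")
        embeddings = model.encode(["瑠璃色はどんな色?", "瑠璃色の意味"])
        print(cosine_sim(embeddings[0], embeddings[1]))
    except Exception as exc:  # assumption: demo is optional in this sketch
        print(f"skipping model demo: {exc}")
```

Embedding models in the table are compared by JMTEB averages; at inference time, ranking candidate documents by this cosine score against a query embedding is the typical retrieval setup.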