Updated metrics
Updated model metrics
README.md CHANGED
```diff
@@ -23,19 +23,19 @@ model-index:
       type: unknown
     metrics:
     - type: cosine_accuracy
-      value:
+      value: 0.90
       name: Cosine Accuracy
     - type: cosine_f1
-      value:
+      value: 0.87
       name: Cosine F1
     - type: cosine_precision
-      value:
+      value: 0.84
       name: Cosine Precision
     - type: cosine_recall
-      value:
+      value: 0.90
       name: Cosine Recall
     - type: cosine_ap
-      value:
+      value: 0.92
       name: Cosine Ap
 ---
 
@@ -106,11 +106,11 @@ print(similarities.shape)
 
 | Metric                    | Value     |
 |:--------------------------|:----------|
-| cosine_accuracy           |           |
-| cosine_f1                 |           |
-| cosine_precision          |           |
-| cosine_recall             |           |
-| **cosine_ap**             |           |
+| cosine_accuracy           | 0.90      |
+| cosine_f1                 | 0.87      |
+| cosine_precision          | 0.84      |
+| cosine_recall             | 0.90      |
+| **cosine_ap**             | 0.92      |
 
 
 ### Training Dataset
@@ -118,7 +118,7 @@ print(similarities.shape)
 #### Quora
 
 * Dataset: [Quora](https://www.kaggle.com/datasets/quora/question-pairs-dataset)
-* Size:
+* Size: 323491 training samples
 * Columns: <code>question_1</code>, <code>question_2</code>, and <code>label</code>
 
 ### Evaluation Dataset
@@ -126,7 +126,7 @@ print(similarities.shape)
 #### Quora
 
 * Dataset: [Quora](https://www.kaggle.com/datasets/quora/question-pairs-dataset)
-* Size: evaluation samples
+* Size: 53486 evaluation samples
 * Columns: <code>question_1</code>, <code>question_2</code>, and <code>label</code>
 
 ## Citation
```
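The metrics filled in above (cosine accuracy, F1, precision, recall, and average precision) are the ones reported by sentence-transformers' `BinaryClassificationEvaluator`, which thresholds cosine similarity between the two question embeddings to predict the duplicate label. A minimal sketch of how such numbers could be reproduced on question pairs is shown below; the model ID and the example pairs are placeholders, not taken from this commit.

```python
# Minimal sketch (placeholder data and model ID): reproducing cosine-similarity
# classification metrics with sentence-transformers' BinaryClassificationEvaluator.
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import BinaryClassificationEvaluator

model = SentenceTransformer("your-username/your-model")  # placeholder model ID

# Each example is a question pair plus a 0/1 duplicate label, matching the
# question_1 / question_2 / label columns listed in the README.
questions_1 = ["How do I learn Python?", "What is the capital of France?"]
questions_2 = ["What is the best way to learn Python?", "How old is the Eiffel Tower?"]
labels = [1, 0]

evaluator = BinaryClassificationEvaluator(questions_1, questions_2, labels, name="quora-dev")
results = evaluator(model)
# Recent sentence-transformers versions return a dict that includes the cosine
# accuracy, F1, precision, recall, and average-precision values.
print(results)
```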