Feature Extraction · Transformers · Safetensors · ModularStarEncoder · custom_code
andreagurioli1995 committed (verified)
Commit 63af01a · 1 Parent(s): 0c0d4bb

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -12,7 +12,7 @@ base_model:
 <!-- Provide a quick summary of what the model is/does. -->
 
 ModularStarEncoder-finetuned is an encoder built on top of [ModularStarEncoder-1B Pre-trained](https://huggingface.co/andreagurioli1995/ModularStarEncoder) on [SynthCode2Code2NL](add link here).
-ModularStarEncoder fine-tuned encoder for various retrieval tasks, enabling the end user to select the model size that meets their memory and computational constraints.
+ModularStarEncoder fine-tuned is an encoder for various retrieval tasks, enabling the end user to select the model size that meets their memory and computational constraints.
 We built ModularStarEncoder on top of [StarCoder-2](https://huggingface.co/bigcode/starcoder2-15b), reducing its size from 15B to 1B parameters in bfloat16.
 
 The model is finetuned with [CLIP objective](https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/loss.py)
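
For orientation, below is a minimal sketch of how an encoder tagged for Feature Extraction with custom_code could be loaded and queried for embeddings via transformers. The repo id, the `trust_remote_code=True` loading path, and the mean-pooling step are assumptions inferred from the tags above and the linked pre-trained checkpoint, not usage instructions taken from this card.

```python
# Minimal sketch, not from the model card: assumes the checkpoint loads via
# transformers AutoModel with trust_remote_code=True (per the custom_code tag).
import torch
from transformers import AutoModel, AutoTokenizer

# Repo id is an assumption; this excerpt only links the pre-trained checkpoint.
model_id = "andreagurioli1995/ModularStarEncoder"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
model.eval()

# Encode a code snippet for retrieval-style feature extraction.
code_snippet = "def add(a, b):\n    return a + b"
inputs = tokenizer(code_snippet, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Assumed pooling: mean over the last hidden state to obtain one embedding
# per input; the full card may define its own pooling or projection head.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)
```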