Support this model architecture for optimum onnx export
#19
by AshokRaja - opened
When trying to export this model to ONNX:

```python
from sentence_transformers import SentenceTransformer, export_optimized_onnx_model

model_kwargs = {
    "provider": "CUDAExecutionProvider",
    "export": True,
}
model_path = "/data/nomic-embed-text-v2-moe"
model = SentenceTransformer(
    model_name_or_path=model_path,
    backend="onnx",
    model_kwargs=model_kwargs,
    trust_remote_code=True,
)
export_optimized_onnx_model(model, optimization_config="O4", model_name_or_path=model_path)
```
I'm facing this issue:
```
raise ValueError(
ValueError: Trying to export a nomic-bert model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as 'custom_onnx_configs'.
Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models.
Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type nomic-bert to be supported natively in the ONNX export.
```
@tomaarsen @Xenova I don't have much experience creating an ONNX model from SentenceTransformers. What's the best way to export a custom architecture like this one?