Support this model architecture for Optimum ONNX export

#19
by AshokRaja - opened

When trying to export this model to ONNX:

```python
from sentence_transformers import SentenceTransformer, export_optimized_onnx_model

model_kwargs = {
    "provider": "CUDAExecutionProvider",
    "export": True,
}
model_path = "/data/nomic-embed-text-v2-moe"
model = SentenceTransformer(model_name_or_path=model_path, backend="onnx", model_kwargs=model_kwargs, trust_remote_code=True)
export_optimized_onnx_model(model, optimization_config="O4", model_name_or_path=model_path)
```

Facing this issue:

```
raise ValueError(
ValueError: Trying to export a nomic-bert model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as 'custom_onnx_configs'.
Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models.
Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type nomic-bert to be supported natively in the ONNX export.
```
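
For reference, a minimal sketch of the `custom_onnx_configs` route the error message points to might look like the snippet below. It reuses Optimum's stock `BertOnnxConfig` as a stand-in config for nomic-bert, which assumes the model's inputs match a standard BERT encoder (that may not hold for this custom MoE architecture); the output directory name is made up for illustration.

```python
from transformers import AutoConfig
from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.model_configs import BertOnnxConfig

model_path = "/data/nomic-embed-text-v2-moe"  # same local path as above

# Assumption: nomic-bert's forward signature (input_ids, attention_mask,
# token_type_ids) is close enough to standard BERT for BertOnnxConfig to
# describe it. If the custom code diverges, a hand-written OnnxConfig
# subclass would be needed instead, per the Optimum docs linked in the error.
config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
onnx_config = BertOnnxConfig(config, task="feature-extraction")

main_export(
    model_path,
    output="nomic-onnx",  # hypothetical output directory
    task="feature-extraction",
    trust_remote_code=True,
    # "model" is the key for the single exported graph of an encoder-only model
    custom_onnx_configs={"model": onnx_config},
)
```

If the export succeeds, the resulting `model.onnx` could presumably then be loaded with `SentenceTransformer(..., backend="onnx")` and optimized with `export_optimized_onnx_model` as in the snippet above.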
Nomic AI org

@tomaarsen @Xenova I don't have a ton of experience creating an ONNX model from SentenceTransformers. What's the best way to do this for a custom model?

Bump

We could ask @zpn, who pushed the ONNX version for v1.5.
