Jean-Baptiste dlb committed · Commit 9e4fa41 (unverified) · 1 Parent(s): a7560a6

Update openai.md (#1730)


I was able to set `multimodal: true` when using an OpenAI-compatible backend running Qwen2-VL on a vLLM OpenAI-like server.
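For context, here is a minimal sketch of how this flag can be set in Chat UI's `MODELS` environment variable (in `.env.local`), assuming a vLLM OpenAI-compatible server serving Qwen2-VL at a hypothetical local URL; the model name and port are illustrative, not from this commit:

```env
# .env.local — hypothetical Chat UI config for a vLLM-served Qwen2-VL model
MODELS=`[
  {
    "name": "Qwen/Qwen2-VL-7B-Instruct",
    "multimodal": true,
    "endpoints": [
      {
        "type": "openai",
        "baseURL": "http://localhost:8000/v1"
      }
    ]
  }
]`
```

With `multimodal` set to `true`, Chat UI allows image uploads in the chat and forwards them to the OpenAI-compatible endpoint, which only works if the backing model (here, a vision-language model like Qwen2-VL) actually accepts image inputs.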

docs/source/configuration/models/providers/openai.md CHANGED
@@ -3,7 +3,7 @@
  | Feature | Available |
  | --------------------------- | --------- |
  | [Tools](../tools) | No |
- | [Multimodal](../multimodal) | No |
+ | [Multimodal](../multimodal) | Yes |

  Chat UI can be used with any API server that supports OpenAI API compatibility, for example [text-generation-webui](https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai), [LocalAI](https://github.com/go-skynet/LocalAI), [FastChat](https://github.com/lm-sys/FastChat/blob/main/docs/openai_api.md), [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), and [ialacol](https://github.com/chenhunghan/ialacol) and [vllm](https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html).