Update openai.md (#1730)
Jean-Baptiste dlb committed
I was able to set `multimodal: true` when using an OpenAI-compatible backend running Qwen2-VL on a vLLM OpenAI-like server.
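For context, the setting this commit documents is enabled in Chat UI's `MODELS` environment variable. The sketch below is a hypothetical configuration, assuming a local vLLM server on port 8000; the model name and `baseURL` are illustrative, not values taken from the PR:

```
# .env.local — illustrative example; adjust "name" and "baseURL" to your deployment
MODELS=`[
  {
    "name": "Qwen/Qwen2-VL-7B-Instruct",
    "multimodal": true,
    "endpoints": [
      {
        "type": "openai",
        "baseURL": "http://localhost:8000/v1"
      }
    ]
  }
]`
```

With `multimodal` set to `true`, Chat UI will send images alongside text to the OpenAI-compatible endpoint, which the vision-language model behind it is assumed to accept.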
docs/source/configuration/models/providers/openai.md
CHANGED
@@ -3,7 +3,7 @@
 | Feature                     | Available |
 | --------------------------- | --------- |
 | [Tools](../tools)           | No        |
-| [Multimodal](../multimodal) |
+| [Multimodal](../multimodal) | Yes       |
 
 Chat UI can be used with any API server that supports OpenAI API compatibility, for example [text-generation-webui](https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai), [LocalAI](https://github.com/go-skynet/LocalAI), [FastChat](https://github.com/lm-sys/FastChat/blob/main/docs/openai_api.md), [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), and [ialacol](https://github.com/chenhunghan/ialacol) and [vllm](https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html).
 