fix: fix Script path in readme
README.md
@@ -239,10 +239,10 @@ OWL supports various LLM backends. You can use the following scripts to run with
 
 ```bash
 # Run with Qwen model
-python owl/
+python owl/run_qwen_zh.py
 
 # Run with Deepseek model
-python owl/
+python owl/run_deepseek_zh.py
 
 # Run with other OpenAI-compatible models
 python owl/run_openai_compatiable_model.py
@@ -251,7 +251,7 @@ python owl/run_openai_compatiable_model.py
 For a simpler version that only requires an LLM API key, you can try our minimal example:
 
 ```bash
-python owl/
+python owl/run_mini_zh.py
 ```
 
 You can run OWL agent with your own task by modifying the `run.py` script:
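For reference, a minimal sketch of how the corrected minimal example might be launched, assuming the LLM API key is provided through an environment variable (the variable name below is an assumption for illustration, not taken from this diff; the actual key names are defined in the project's own configuration):

```bash
# Assumed setup: the key variable name is illustrative; check the project's .env template
export OPENAI_API_KEY="your-api-key"

# Minimal example using the corrected script path from this commit
python owl/run_mini_zh.py
```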