Merge pull request #17 from dexhunter/main
:memo: Update README to include local LLM usage
README.md
CHANGED
@@ -82,6 +82,29 @@ To further customize the behaviour of AIDE, some useful options might be:
You can check the [`config.yaml`](aide/utils/config.yaml) file for more options.

### Using Local LLMs

AIDE supports using local LLMs through OpenAI-compatible APIs. Here's how to set it up:

1. Set up a local LLM server with an OpenAI-compatible API endpoint. You can use:
   - [Ollama](https://github.com/ollama/ollama) (see the sketch after these steps)
   - or similar solutions
2. Configure your environment to use the local endpoint:

   ```bash
   export OPENAI_BASE_URL="http://localhost:11434/v1"  # For Ollama
   export OPENAI_API_KEY="local-llm"  # Can be any string if your local server doesn't require authentication
   ```
3. Update the model configuration in your AIDE command or config. For example, with Ollama:

   ```bash
   # Example with house prices dataset
   aide agent.code.model="qwen2.5" agent.feedback.model="qwen2.5" report.model="qwen2.5" \
     data_dir="example_tasks/house_prices" \
     goal="Predict the sales price for each house" \
     eval="Use the RMSE metric between the logarithm of the predicted and observed values."
   ```
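To make step 1 concrete: a minimal sketch of the Ollama route, assuming Ollama is already installed locally and using the `qwen2.5` model name from the example above.

```bash
# Pull the model referenced in the aide command above.
ollama pull qwen2.5

# Start the server if it isn't already running; by default it exposes
# an OpenAI-compatible API at http://localhost:11434/v1.
ollama serve
```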
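Before launching a full run, it can be worth confirming that the endpoint from step 2 actually speaks the OpenAI protocol. Here is a minimal sketch using the official `openai` Python client, which reads `OPENAI_BASE_URL` and `OPENAI_API_KEY` from the environment; `qwen2.5` is just the model name from the example above.

```python
from openai import OpenAI

# The client picks up OPENAI_BASE_URL and OPENAI_API_KEY from the
# environment, so this targets the local server configured in step 2.
client = OpenAI()

response = client.chat.completions.create(
    model="qwen2.5",  # must match a model your local server is serving
    messages=[{"role": "user", "content": "Reply with the word: ready"}],
)
print(response.choices[0].message.content)
```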
## Using AIDE in Python

Using AIDE within your Python script/project is easy. Follow the setup steps above, and then create an AIDE experiment like below and start running:
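The snippet that paragraph points to sits below this hunk, so the diff does not show it. As a hedged sketch only, assuming an `aide.Experiment` entry point whose parameters mirror the CLI arguments above:

```python
import aide

# Hypothetical reconstruction mirroring the CLI example above;
# the authoritative snippet is in the full README.
exp = aide.Experiment(
    data_dir="example_tasks/house_prices",
    goal="Predict the sales price for each house",
    eval="Use the RMSE metric between the logarithm of the predicted and observed values.",
)

best_solution = exp.run(steps=10)
print(best_solution.code)
```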