# Google
| Feature | Available |
| --------------------------- | --------- |
| [Tools](../tools) | No |
| [Multimodal](../multimodal) | No |
Chat UI can connect to the Google Vertex AI API endpoints ([list of supported models](https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models)).
To enable:
1. [Select](https://console.cloud.google.com/project) or [create](https://cloud.google.com/resource-manager/docs/creating-managing-projects#creating_a_project) a Google Cloud project.
1. [Enable billing for your project](https://cloud.google.com/billing/docs/how-to/modify-project).
1. [Enable the Vertex AI API](https://console.cloud.google.com/flows/enableapi?apiid=aiplatform.googleapis.com).
1. [Set up authentication with a service account](https://cloud.google.com/docs/authentication/getting-started)
so you can access the API from your local workstation.
The service account credentials file can be provided via an environment variable:
```ini
GOOGLE_APPLICATION_CREDENTIALS = clientid.json
```
Make sure your Docker container has access to the file and that the variable is set correctly.
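For example, if the credentials file is mounted into the container, point the variable at the path inside the container. A minimal sketch, assuming the file is mounted at `/app/clientid.json` (the path is just an example):

```ini
# assuming clientid.json is mounted into the container at /app/clientid.json
GOOGLE_APPLICATION_CREDENTIALS=/app/clientid.json
```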
Afterwards, Google Vertex endpoints can be configured as follows:
```ini
MODELS=`[
  {
    "name": "gemini-1.5-pro",
    "displayName": "Vertex Gemini Pro 1.5",
    "endpoints": [{
      "type": "vertex",
      "project": "abc-xyz",
      "location": "europe-west3",
      "extraBody": {
        "model_version": "gemini-1.5-pro-002"
      },
      // Optional
      "safetyThreshold": "BLOCK_MEDIUM_AND_ABOVE",
      "apiEndpoint": "", // alternative api endpoint url
      "tools": [{
        "googleSearchRetrieval": {
          "disableAttribution": true
        }
      }]
    }]
  }
]`
```
## GenAI
Alternatively, use the [Gemini API provider](https://github.com/google-gemini/generative-ai-js#readme):
Make sure you have an API key from Google Cloud Platform; to get one, follow the instructions [here](https://ai.google.dev/gemini-api/docs/api-key).
You can either specify the key in your `.env.local` via the `GOOGLE_GENAI_API_KEY` variable, or set it directly in the endpoint config.
You can find the list of models available [here](https://ai.google.dev/gemini-api/docs/models/gemini), and experimental models available [here](https://ai.google.dev/gemini-api/docs/models/experimental-models).
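If you prefer the environment variable over per-endpoint keys, a minimal `.env.local` entry looks like this (the key value is a placeholder):

```ini
GOOGLE_GENAI_API_KEY=abc...xyz
```

With endpoint-level keys, the configuration looks like this instead: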
```ini
MODELS=`[
  {
    "name": "gemini-1.5-flash",
    "displayName": "Gemini Flash 1.5",
    "multimodal": true,
    "endpoints": [
      {
        "type": "genai",
        // Optional
        "apiKey": "abc...xyz",
        "safetyThreshold": "BLOCK_MEDIUM_AND_ABOVE"
      }
    ]
  },
  {
    "name": "gemini-1.5-pro",
    "displayName": "Gemini Pro 1.5",
    "multimodal": false,
    "endpoints": [
      {
        "type": "genai",
        // Optional
        "apiKey": "abc...xyz"
      }
    ]
  }
]`
```