DeepSite locally with Ollama

#96
by martinsmessias - opened

I created a custom version of DeepSite to run locally!

Now you can run the powerful DeepSite platform directly on your own machine, fully customizable and with no need for external services. 🌟
Using Ollama, you can seamlessly integrate any AI model (Llama 2, Mistral, DeepSeek, etc.) into your setup, giving you full control over your environment and workflow.
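For a rough idea of what the local setup is doing under the hood, here is a minimal sketch of querying an Ollama server running on its default port. The model name, prompt, and endpoint choice below are only assumptions for illustration; the actual wiring inside deepsite-locally may differ.

```python
# Minimal sketch: sending a prompt to a locally running Ollama server.
# Assumes Ollama is listening on its default port (11434) and that a model
# such as "mistral" has already been pulled (e.g. via `ollama pull mistral`).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's generate endpoint

payload = {
    "model": "mistral",                                   # any locally pulled model
    "prompt": "Generate a simple landing page in HTML.",  # placeholder prompt
    "stream": False,                                      # one complete response, no streaming
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read().decode("utf-8"))

print(result["response"])  # the model's generated text
```

Swapping in a different model (Llama 2, DeepSeek, etc.) is just a matter of pulling it with Ollama and changing the `model` field.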

[Screenshot: localconfig.png]

Check out the project on GitHub: https://github.com/MartinsMessias/deepsite-locally

martinsmessias changed discussion title from DeepSite with Ollama to DeepSite locally with Ollama