---
title: VacAIgent
emoji: 🐨
colorFrom: yellow
colorTo: purple
sdk: streamlit
sdk_version: 1.44.1
app_file: app.py
pinned: false
license: apache-2.0
short_description: Let AI agents plan your next vacation!
---
# πŸ–οΈ VacAIgent: Streamlit-Integrated AI Crew for Trip Planning
_Forked and enhanced from the_ [_crewAI examples repository_](https://github.com/joaomdmoura/crewAI-examples/tree/main/trip_planner)
## Introduction
VacAIgent leverages the CrewAI framework to automate and enhance the trip planning experience, integrating a user-friendly Streamlit interface. This project demonstrates how autonomous AI agents can collaborate and execute complex tasks efficiently, now with an added layer of interactivity and accessibility through Streamlit.
**Check out the video below for a code walkthrough** 👇
<a href="https://youtu.be/nKG_kbQUDDE">
<img src="https://img.youtube.com/vi/nKG_kbQUDDE/hqdefault.jpg" alt="Watch the video" width="100%">
</a>
(_Trip example originally developed by [@joaomdmoura](https://x.com/joaomdmoura)_)
## CrewAI Framework
CrewAI simplifies the orchestration of role-playing AI agents. In VacAIgent, these agents collaboratively decide on cities and craft a complete itinerary for your trip based on specified preferences, all accessible via a streamlined Streamlit user interface.
## Streamlit Interface
The introduction of [Streamlit](https://streamlit.io/) transforms this application into an interactive web app, allowing users to easily input their preferences and receive tailored travel plans.
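For illustration, collecting the trip preferences in `streamlit_app.py` might look roughly like the sketch below (the widget labels and field names here are placeholders, not the app's exact code):
```python
import streamlit as st

# Illustrative only: the real streamlit_app.py may use different widgets and fields.
with st.sidebar:
    origin = st.text_input("Where are you traveling from?", value="San Francisco")
    cities = st.text_input("Which cities or regions are you considering?")
    date_range = st.date_input("Travel dates", value=[])
    interests = st.text_area("Interests and trip preferences (food, hiking, museums, ...)")
    submitted = st.button("Plan my trip")

if submitted:
    st.write("Kicking off the AI crew with your preferences...")
    # The collected values are then passed to the CrewAI agents and tasks.
```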
## Running the Application
To experience the VacAIgent app:
### Pre-Requisites
1. Install and configure **git** on your machine
2. Get an API key from **ScrapingAnt** ([sign up here](https://scrapingant.com/))
3. Get an API key from **Serper** ([sign up here](https://serper.dev/))
### Deploy Trip Planner
#### Step 1
Clone the repository
```sh
git clone https://github.com/intel-sandbox/trip_planner_agent
```
* *Please make sure git is installed*
#### Step 2
Change into the project directory and install the dependencies
```sh
cd trip_planner_agent
pip install -r requirements.txt
```
#### Step 3
Create a `.streamlit/secrets.toml` file and add your **credentials** (you can use the `secrets.example` file for reference):
```toml
SERPER_API_KEY=""
SCRAPINGANT_API_KEY=""
OPENAI_API_KEY=""
MODEL_ID=""
MODEL_BASE_URL=""
```
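Inside the app, these values are available through Streamlit's `st.secrets`; a minimal sketch of how they might be forwarded to the environment (the actual code in `streamlit_app.py` may differ) is:
```python
import os
import streamlit as st

# Forward values from .streamlit/secrets.toml to environment variables,
# where the OpenAI client and the search/scraping tools expect to find them.
for key in ("SERPER_API_KEY", "SCRAPINGANT_API_KEY", "OPENAI_API_KEY"):
    if key in st.secrets:
        os.environ[key] = st.secrets[key]
```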
#### Step 4
Run the application
```sh
streamlit run streamlit_app.py
```
Your application should now be up and running.
★ **Disclaimer**: The application uses GPT-4 by default. Ensure you have access to OpenAI's API and be aware of the associated costs.
## Details & Explanation
- **Streamlit UI**: The Streamlit interface is implemented in `streamlit_app.py`, where users can input their trip details.
- **Components**:
- `./trip_tasks.py`: Contains task prompts for the agents.
- `./trip_agents.py`: Manages the creation of agents.
- `./tools/`: Houses the tool classes used by the agents.
- `./streamlit_app.py`: The heart of the Streamlit app, tying the agents and tasks together (sketched below).
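As a rough sketch of how these pieces fit together (the factory method names and task signatures below follow the upstream crewAI trip_planner example and may differ slightly in this repo), `streamlit_app.py` builds the agents and tasks and hands them to a `Crew`:
```python
from crewai import Crew

from trip_agents import TripAgents
from trip_tasks import TripTasks

# Example user inputs; in the app these come from the Streamlit widgets.
origin, cities, interests, date_range = (
    "San Francisco", "Lisbon or Porto", "food, hiking, museums", "2025-06-01 to 2025-06-10"
)

agents = TripAgents()
tasks = TripTasks()

# Factory methods and task arguments shown here are illustrative.
city_selector = agents.city_selection_agent()
local_expert = agents.local_expert()
concierge = agents.travel_concierge()

crew = Crew(
    agents=[city_selector, local_expert, concierge],
    tasks=[
        tasks.identify_task(city_selector, origin, cities, interests, date_range),
        tasks.gather_task(local_expert, origin, interests, date_range),
        tasks.plan_task(concierge, origin, interests, date_range),
    ],
    verbose=True,
)
result = crew.kickoff()
```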
## Using GPT-3.5
To switch from GPT-4 to GPT-3.5, pass an `llm` argument to the agent constructor:
```python
from crewai import Agent
from langchain.chat_models import ChatOpenAI

# Load gpt-3.5-turbo (see more OpenAI models at https://platform.openai.com/docs/models/gpt-4-turbo-and-gpt-4)
llm = ChatOpenAI(model='gpt-3.5-turbo')
class TripAgents:
    # ... existing methods

    def local_expert(self):
        return Agent(
            role='Local Expert',
            goal='Provide insights about the selected city',
            tools=[SearchTools.search_internet, BrowserTools.scrape_and_summarize_website],
            llm=llm,
            verbose=True
        )
```
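In principle, any chat model that LangChain can construct can be supplied the same way through the `llm` argument; the `MODEL_ID` and `MODEL_BASE_URL` entries in `secrets.toml` suggest the app can likewise be pointed at an OpenAI-compatible endpoint.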
## Using Local Models with Ollama
For enhanced privacy and customization, you can run local models through Ollama:
### Setting Up Ollama
- **Installation**: Follow the installation guide at [ollama.com](https://ollama.com/) (example commands below).
- **Configuration**: Pull or create the local model you want the agents to use.
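As a rough example (Linux commands shown; the model names are illustrative), the setup could look like:
```sh
# Install Ollama (see https://ollama.com for macOS/Windows installers)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a base model, then optionally create a customized variant named "agent"
ollama pull openhermes
ollama create agent -f ./Modelfile   # e.g. a Modelfile containing "FROM openhermes"
```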
### Integrating Ollama with CrewAI
Pass the Ollama model to agents in the CrewAI framework:
```python
from crewai import Agent
from langchain.llms import Ollama

# "agent" is the name of the local Ollama model (e.g. created from a Modelfile as above);
# replace it with whichever model you pulled or created.
ollama_model = Ollama(model="agent")
class TripAgents:
    # ... existing methods

    def local_expert(self):
        return Agent(
            role='Local Expert',
            goal='Provide insights about the selected city',
            tools=[SearchTools.search_internet, BrowserTools.scrape_and_summarize_website],
            llm=ollama_model,
            verbose=True
        )
```
## Benefits of Local Models
- **Privacy**: Process sensitive data in-house.
- **Customization**: Tailor models to fit specific needs.
- **Performance**: Potentially faster responses with on-premises models.
## License
VacAIgent is open-sourced under the MIT License.
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference