This project is a chatbot powered by the Deepseek R1 model, built using Streamlit. It leverages the Ollama API to process user inputs and generate real-time responses. 🚀
- ✅ Interactive chatbot with Streamlit
- ✅ AI responses using the Deepseek R1 model
- ✅ Local execution with Ollama API
- ✅ Real-time conversation support
First, ensure you have Python and Git installed. Then, create a virtual environment and install the required dependencies:
# Create a virtual environment (optional)
python -m venv myenv
source myenv/bin/activate # macOS/Linux
myenv\Scripts\activate # Windows
# Install required packages
pip install streamlit requests
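If you prefer pinned dependencies, the same two packages can go in a `requirements.txt` (the version floors below are illustrative, not requirements of this repo):

```text
streamlit>=1.30
requests>=2.31
```

Then install them with `pip install -r requirements.txt`.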
Make sure you have Ollama installed. You can install it using:
# Install Ollama (if not installed)
winget install ollama # Windows
brew install ollama # macOS
curl -fsSL https://ollama.com/install.sh | sh # Linux
Now, download and run the Deepseek R1 model:
ollama run deepseek-r1
This downloads the model on first run and exposes the Ollama HTTP API at http://localhost:11434.
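With the server running, you can exercise the API directly before touching the UI. Below is a minimal sketch using only the standard library; the `build_payload` and `ask` helper names are illustrative, and the `/api/generate` route is Ollama's non-streaming text-generation endpoint:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint

def build_payload(prompt, model="deepseek-r1"):
    """Build the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt):
    """POST the prompt to the local Ollama server and return its reply text.

    Requires `ollama run deepseek-r1` to be active on this machine.
    """
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (only works while the model is running):
#   print(ask("Why is the sky blue?"))
```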
Finally, start the Streamlit application:
streamlit run app.py
Once the server starts, open the generated URL in your browser and start chatting! 🎉
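If you are building `app.py` from scratch, the pieces above can be wired together roughly as follows. This is a sketch, not this repo's actual `app.py`: the `chat` helper and the use of the `/api/chat` route are assumptions, and the UI relies on Streamlit's `st.chat_input`/`st.chat_message` widgets:

```python
# app.py — minimal chat UI sketch for a local Deepseek R1 model via Ollama
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # default local endpoint

def chat(messages, model="deepseek-r1"):
    """Send the running conversation to Ollama and return the assistant reply."""
    body = json.dumps(
        {"model": model, "messages": messages, "stream": False}
    ).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

if __name__ == "__main__":  # Streamlit executes app.py as __main__
    import streamlit as st

    st.title("Deepseek R1 Chatbot")
    if "history" not in st.session_state:
        st.session_state.history = []

    # Replay the conversation so far
    for msg in st.session_state.history:
        st.chat_message(msg["role"]).write(msg["content"])

    # Read a new prompt, query the model, and show both sides
    if prompt := st.chat_input("Say something"):
        st.session_state.history.append({"role": "user", "content": prompt})
        st.chat_message("user").write(prompt)
        reply = chat(st.session_state.history)
        st.session_state.history.append({"role": "assistant", "content": reply})
        st.chat_message("assistant").write(reply)
```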
- Python 🐍
- Streamlit 🖥️
- Ollama API 🌍
- Deepseek R1 Model 🤖
If you’d like to contribute:
- Fork the repository 🍴
- Add a feature or fix a bug 🛠️
- Submit a Pull Request (PR)! 🚀
This project is licensed under the MIT License. 🎓
After creating the README.md file, you can add it to your GitHub repo using:
git add README.md
git commit -m "Added README file 📜"
git push origin main