
asseukihuh/AI-WebApp


Run your local ai-webapp using the Ollama API! 🤖

⚠️ This is currently for experimental purposes; it may produce inappropriate output at times ⚠️

📌 Summary

Computer specifications
Functionalities
Install on Linux
Install on Windows

💻 Specs

Minimum specs:

Operating System: Linux, macOS, or Windows
Memory (RAM): 8GB
Processor: A relatively modern CPU (less than 5 years old, 4 cores)
GPU: An integrated GPU works, but inference will be slow

Recommended specs:

Operating System: Linux, macOS, or Windows
Memory (RAM): 16GB
Processor: A relatively modern CPU (less than 5 years old, 8 cores)
GPU: A dedicated GPU with at least 6GB of VRAM (CUDA-capable NVIDIA GPUs perform best)

📋 Functionalities

  • Respond to prompts with continuous conversation context (output may be inappropriate depending on the model)
  • Change the model inside the webapp

Configuration:

🐧 Linux

1. Install Ollama

curl -fsSL https://ollama.com/install.sh | sh
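
You can check that the install succeeded by printing the CLI version (the ollama CLI ships with a --version flag):

ollama --version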

2. Run Ollama

ollama serve
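
By default the Ollama server listens on port 11434. A quick sanity check from another terminal (it should answer with a short "Ollama is running" message):

curl http://localhost:11434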

3. Choose a model and make sure it runs

You can find models at ollama.com/search.

Once you've found a model, run it and test a few prompts to make sure it works.

ollama run <model name>
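
For example, with llama3.2 (just one model from the search page; any other works the same way):

ollama run llama3.2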

4. Clone this repository to your computer

git clone https://github.com/asseukihuh/ai-webapp

5. Host a web server on your computer

python -m http.server 8000
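
A minimal sketch, assuming you start the server from the freshly cloned directory so that index.html is served at the root:

cd ai-webapp
python -m http.server 8000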

6. Test your local ai-webapp

Go to the address localhost:8000 in your browser.

Navigate to the directory where index.html is located.

And there is your local ai-webapp.
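
If the page loads but the chat stays silent, you can check that the Ollama API itself is reachable. A sketch using the /api/generate endpoint (replace llama3.2 with whichever model you pulled):

curl http://localhost:11434/api/generate -d '{"model": "llama3.2", "prompt": "Hello", "stream": false}'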

🪟 Windows

1. Install Ollama

Download and install Ollama from ollama.com

2. Run Ollama

ollama serve

3. Choose a model and make sure it runs

You can find models at ollama.com/search.

Once you've found a model, run it and test a few prompts to make sure it works.

ollama run <model name>

4. Clone this repository to your computer

git clone https://github.com/asseukihuh/ai-webapp

5. Host a web server on your computer

python -m http.server 8000

6. Test your local ai-webapp

Go to the address localhost:8000 in your browser. (You can also point the server at a specific directory with the --directory flag, as shown below.)

Navigate to the directory where index.html is located.

And there is your local ai-webapp.
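
The --directory option of Python's http.server (available since Python 3.7) lets you serve the repo from anywhere without changing directories; the path below is just a hypothetical example:

python -m http.server 8000 --directory C:\path\to\ai-webapp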
