LocalAPI.AI πŸ€–

Tool Overview 🌟

LocalAPI.AI is a local AI management tool designed specifically for Ollama, and it is also compatible with a variety of mainstream local AI deployment platforms such as vLLM, LM Studio, and llama.cpp.
It integrates intelligent conversation, text generation, multimodal image recognition, and other functions, and provides comprehensive model management support, including model copying, deletion, pulling, updating, creation, and quantization, as well as flexible parameter settings to meet the needs of different users.


Features πŸš€

β€’ Designed for Ollama and Compatible with Multiple Models
Deeply integrated with Ollama's core functions, it is also compatible with a variety of local AI deployment platforms such as vLLM, LM Studio, llama.cpp, Mozilla-Llamafile, Jan AI, Cortex API, Local-LLM, LiteLLM, and GPT4All, meeting the diverse needs of different users.
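As a rough illustration of the kind of request the tool sends, the sketch below posts a single chat message to a local Ollama server's /api/chat endpoint. It assumes Ollama's default port (11434), and the model name "llama3" is only an example.

```typescript
// Minimal sketch: send one chat message to a local Ollama server and print
// the reply. Assumes Ollama runs on its default port; "llama3" is an example
// model name that must already be installed locally.
async function chatOnce(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: prompt }],
      stream: false, // ask for one complete response instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content; // the assistant's reply text
}

chatOnce("Why is the sky blue?").then(console.log).catch(console.error);
```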

β€’ Quick Setup of Local Model Service Authentication Middleware
Simply download the macOS, Windows, or Linux client to set up the authentication middleware for local model services with one click. No complex configuration is needed, and the model service can be started and secured quickly.
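Conceptually, such middleware sits between the browser and the model service and rejects requests that lack a valid key. The TypeScript sketch below only illustrates that idea; the bearer-token scheme, key, and ports are assumptions, not the actual client implementation.

```typescript
import http from "node:http";

// Hypothetical sketch of an authentication middleware in front of a local
// model service. Header scheme, key, and ports are illustrative assumptions.
const API_KEY = "change-me"; // assumed locally configured secret
const UPSTREAM = "http://localhost:11434"; // e.g. a local Ollama server

http.createServer(async (req, res) => {
  // Reject requests that do not carry the expected bearer token.
  if (req.headers.authorization !== `Bearer ${API_KEY}`) {
    res.writeHead(401).end("Unauthorized");
    return;
  }
  // Forward the authenticated request body to the local model service.
  const chunks: Buffer[] = [];
  for await (const chunk of req) chunks.push(chunk as Buffer);
  const upstream = await fetch(UPSTREAM + (req.url ?? "/"), {
    method: req.method,
    headers: { "Content-Type": "application/json" },
    body: chunks.length > 0 ? Buffer.concat(chunks) : undefined,
  });
  res.writeHead(upstream.status, { "Content-Type": "application/json" });
  res.end(await upstream.text());
}).listen(8080); // the browser talks to :8080 instead of the raw service
```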

β€’ Comprehensive Model Management Functions
It provides advanced functions such as model copying, deletion, pulling, updating, creation, and model quantization, and supports flexible parameter settings to meet the diverse needs of users ranging from individual developers to enterprise users.

β€’ Deeply Optimized Static Files
The built static files are deeply optimized and merged into a single HTML file. With just this one HTML file, powerful local AI API interaction is available: no complex deployment, ready to use.

β€’ Responsive Design, Mobile Compatible
It supports access from a variety of devices and is compatible with mobile devices, allowing users to start AI interaction experiences anytime, anywhere through their smartphones or tablets.

β€’ Security and Privacy Protection
All data processing is completed locally and is never uploaded to the cloud or third-party servers, ensuring data security. After the initial load, the tool can be used offline without an internet connection, and users retain full control over their data.

β€’ Online Usage
No installation is required; visiting LocalAPI.ai gives access to the full range of functions. Simply configure the browser to enable cross-origin support for a smooth online AI interaction experience in the browser.


Main Functions πŸš€

β€’ Intelligent Conversation
Interact with AI models in natural language to obtain intelligent answers and suggestions. Messages give instant feedback, chat history can be saved in the browser, and parameters and prompts are highly configurable.

β€’ Text Generation
Supports generating various types of text content to improve creative efficiency, and already supports selected multimodal models for image recognition. Performance metrics are reported instantly, intuitively displaying load time, processing time, evaluation time, token generation speed, and more.
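These metrics can be derived from the timing fields Ollama returns with each generation (all durations are reported in nanoseconds). A minimal sketch, assuming a local Ollama server and an example model name:

```typescript
// Sketch: derive UI-style performance metrics from an Ollama /api/generate
// response. Ollama reports all durations in nanoseconds; "llama3" is only an
// example model name.
async function showMetrics(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt: "Write a haiku.", stream: false }),
  });
  const d = await res.json();
  const sec = (ns: number) => (ns / 1e9).toFixed(2); // nanoseconds -> seconds
  console.log(`load time:    ${sec(d.load_duration)} s`);
  console.log(`prompt eval:  ${sec(d.prompt_eval_duration)} s`);
  console.log(`eval time:    ${sec(d.eval_duration)} s`);
  console.log(`total time:   ${sec(d.total_duration)} s`);
  // Token generation speed = tokens generated / evaluation time in seconds.
  console.log(`tokens/s:     ${(d.eval_count / (d.eval_duration / 1e9)).toFixed(1)}`);
}

showMetrics().catch(console.error);
```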

β€’ Model Management
Provides comprehensive model management, including advanced functions such as model copying, deletion, pulling, updating, creation, and quantization, with flexible parameter settings to meet diverse needs.
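These operations map onto Ollama's public model management endpoints; a minimal sketch of a few of them, with example model names:

```typescript
const OLLAMA = "http://localhost:11434"; // assumed default local Ollama port

// Sketch: a few model management calls against Ollama's API.
async function manageModels(): Promise<void> {
  // List locally installed models (GET /api/tags).
  const tags = await (await fetch(`${OLLAMA}/api/tags`)).json();
  console.log(tags.models.map((m: { name: string }) => m.name));

  // Copy a model under a new name (POST /api/copy).
  await fetch(`${OLLAMA}/api/copy`, {
    method: "POST",
    body: JSON.stringify({ source: "llama3", destination: "llama3-backup" }),
  });

  // Delete the copy again (DELETE /api/delete).
  await fetch(`${OLLAMA}/api/delete`, {
    method: "DELETE",
    body: JSON.stringify({ model: "llama3-backup" }),
  });
}

manageModels().catch(console.error);
```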

β€’ Prompt Library
Offers a rich prompt library from which users can freely import or export personal AI prompts. Prompts can be edited online and applied to chat conversations with one click, helping users spark creativity and improve the quality of AI output.
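As a purely hypothetical illustration of import and export, the sketch below round-trips a prompt list through JSON; the entry shape is an assumption for illustration, not LocalAPI.AI's actual format.

```typescript
// Hypothetical prompt-library entry; the shape is an illustrative assumption.
interface PromptEntry {
  title: string;
  prompt: string;
  tags?: string[];
}

// Export the library as human-editable JSON.
function exportPrompts(prompts: PromptEntry[]): string {
  return JSON.stringify(prompts, null, 2);
}

// Import a library, validating the top-level structure.
function importPrompts(json: string): PromptEntry[] {
  const parsed = JSON.parse(json);
  if (!Array.isArray(parsed)) throw new Error("Expected an array of prompts");
  return parsed as PromptEntry[];
}

const library = importPrompts(exportPrompts([
  { title: "Summarizer", prompt: "Summarize the following text:", tags: ["writing"] },
]));
console.log(library[0].title); // "Summarizer"
```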


Use Cases 🎯

β€’ AI Enthusiasts
Provides an easy-to-use interactive experience suitable for users interested in AI.

β€’ Students and Beginners
Easy to get started, suitable for assisting learning and daily Q&A.

β€’ Individual Developers
Suitable for creative validation and small project development.

β€’ Enterprise-Level Applications
Although aimed mainly at individual users, its multi-model support and high performance also make it suitable for enterprise-level applications.


🌐 Online Usage

No installation is required: by visiting LocalAPI.ai, users can access the full range of functions.
Simply configure the browser to allow cross-origin requests to achieve a smooth online AI interaction experience.


πŸ“₯ Client Download

The tool supports the three major mainstream operating systems: macOS 🍎, Windows πŸ’», and Linux 🐧.
Users can start the service through the desktop application and then use it in the browser without having to work around cross-origin restrictions, for efficient local AI interaction.


πŸ› οΈ Deployment Guide

⚑ Quick Deployment Method

Download the LocalAPI_WEB.html file and open it directly in the Firefox browser.
Note: Refer to http://localapi.ai/tutorial to enable Ollama cross-origin support (for example, via the OLLAMA_ORIGINS environment variable) and to temporarily disable Firefox's cross-origin restrictions.


⚑ Web Service Deployment Method

Deploy the LocalAPI_WEB.html file to a web service for quick access:

  1. Download the LocalAPI_WEB.html file.
  2. Deploy the file to a web server and open /LocalAPI_WEB.html in a browser (a minimal server sketch follows below).
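
Since the build is a single HTML file, any static file server will do. A minimal Node/TypeScript sketch, where the file path and port are assumptions:

```typescript
import { createServer } from "node:http";
import { readFile } from "node:fs/promises";

// Minimal static server for the single-file build. The path and port are
// assumptions; any static web server (Nginx, etc.) works equally well.
createServer(async (_req, res) => {
  try {
    const html = await readFile("./LocalAPI_WEB.html");
    res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
    res.end(html);
  } catch {
    res.writeHead(404).end("LocalAPI_WEB.html not found");
  }
}).listen(8000, () =>
  console.log("Serving on http://localhost:8000/LocalAPI_WEB.html"),
);
```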

πŸ“Έ Function Demonstration

1. macOS/Windows/Linux Desktop Support

Supports one-click activation of a secure authentication middleware for AI services, allowing usage through the browser without solving cross-origin issues.

Figure 1: macOS/Windows/Linux desktop, supporting one-click activation of authentication middleware


2. Intelligent Conversation Interface

Engage in natural conversations with AI models to obtain instant answers and insights.

Figure 2: Intelligent conversation interface, supporting natural language interaction and instant feedback


3. Advanced Text Generation and Image Recognition

Figures 3-4: Advanced text generation functions, supporting creation of various content types


4. Model Management System

Easily manage and configure multiple AI models to meet different needs.

Figure 5: Model management interface, supporting operations such as model copying, deletion, and updating


5. More Features to Discover

Reliable deployment options, with reserved interfaces for Nginx authentication and custom HTTP request headers, keep data fully private and secure.

Figure 6: Deployment interface, supporting Nginx authentication interfaces and custom HTTP request headers to ensure data privacy and security


Summary

LocalAPI.AI is a powerful, secure, and easy-to-use local AI management tool.
It not only provides deep integration with Ollama but is also compatible with a variety of mainstream local AI deployment platforms. With one-click setup of authentication middleware, comprehensive model management, and highly flexible parameter settings, it offers users an efficient, convenient, and secure experience.

Whether you are an AI enthusiast, student, developer, or enterprise user, you can quickly get started with LocalAPI.AI and fully utilize its powerful functions to achieve secure and efficient local AI interaction!


About the Source Code

Our LocalAPI_WEB.html is built as a static file with Vite + React + TypeScript, while LocalAPI.ai Desktop is developed in Python. Most of the code is currently written by AI, and we are in the process of optimizing it; once optimization is complete, we will open-source the entire codebase. In the meantime, you can edit or adjust LocalAPI_WEB.html with your preferred code editor.


Welcome to use LocalAPI.AI

β€’ GitHub Repository: https://github.com/vam876/LocalAPI.ai

β€’ Contact: [email protected]

β€’ Official Website: LocalAPI.ai
