This is the official frontend user interface component for NeMo Agent Toolkit, an open-source library for building AI agents and workflows.
This project builds upon the work of:
- chatbot-ui by Mckay Wrigley
- chatbot-ollama by Ivan Fioravanti
- Modern and responsive user interface
- Real-time streaming responses
- Human-in-the-loop workflow support
- Light/Dark theme
- WebSocket and HTTP API integration
- Docker support
- NeMo Agent Toolkit installed and configured
- Git
- Node.js (v18 or higher)
- npm or Docker
Clone the repository:
git clone [email protected]:NVIDIA/NeMo-Agent-Toolkit-UI.git
cd NeMo-Agent-Toolkit-UI
Install dependencies:
npm ci
npm run dev
The application will be available at http://localhost:3000
# Build the Docker image
docker build -t nemo-agent-toolkit-ui .
# Run the container with environment variables from .env
# Ensure the .env file is present before running this command.
# Skip --env-file .env if no overrides are needed.
docker run --env-file .env -p 3000:3000 nemo-agent-toolkit-ui
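If you only need one or two overrides, variables can instead be passed inline with `-e`. For example (the server URL below is just a placeholder for a local backend):

```bash
# Example: override the backend URL without maintaining a .env file
docker run -e NEXT_PUBLIC_SERVER_URL=http://localhost:8000 -p 3000:3000 nemo-agent-toolkit-ui
```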
The application supports configuration via environment variables in a .env
file:
Application Configuration:
- `NEXT_PUBLIC_WORKFLOW` - Application workflow name displayed in the UI
- `NEXT_PUBLIC_SERVER_URL` - Backend server URL for HTTP API requests
- `NEXT_PUBLIC_WEBSOCKET_URL` - WebSocket server URL for real-time connections
- `NEXT_PUBLIC_WEBSOCKET_PATH` - WebSocket endpoint path
Feature Toggles:
- `NEXT_PUBLIC_WEB_SOCKET_DEFAULT_ON` - Enable WebSocket mode by default (true/false)
- `NEXT_PUBLIC_CHAT_HISTORY_DEFAULT_ON` - Enable chat history persistence by default (true/false)
- `NEXT_PUBLIC_RIGHT_MENU_OPEN` - Show right menu panel by default (true/false)
- `NEXT_PUBLIC_ENABLE_INTERMEDIATE_STEPS` - Show AI reasoning steps by default (true/false)
Optional Configuration:
- `DEFAULT_MODEL` - Default AI model identifier for server-side rendering
- `MAX_FILE_SIZE_STRING` - Maximum file upload size for all operations (e.g., '5mb', '10mb', '1gb')
- `NODE_ENV` - Environment mode (development/production); affects security settings
- `NEXT_TELEMETRY_DISABLED` - Disable Next.js telemetry data collection (1 to disable)
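A minimal example `.env` is shown below. The workflow name, host, port, and WebSocket path are illustrative assumptions for a local `nat serve` backend; adjust them to match your deployment:

```bash
# .env - example values only; align these with your backend configuration
NEXT_PUBLIC_WORKFLOW="Simple Calculator"
NEXT_PUBLIC_SERVER_URL=http://localhost:8000
NEXT_PUBLIC_WEBSOCKET_URL=ws://localhost:8000
NEXT_PUBLIC_WEBSOCKET_PATH=/websocket
NEXT_PUBLIC_WEB_SOCKET_DEFAULT_ON=false
NEXT_PUBLIC_CHAT_HISTORY_DEFAULT_ON=true
NEXT_PUBLIC_ENABLE_INTERMEDIATE_STEPS=true
NEXT_TELEMETRY_DISABLED=1
```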
Settings can be configured by selecting the Settings icon in the bottom-left corner of the home page.
Appearance:
- Theme: Switch between Light and Dark mode
API Configuration:
- HTTP Endpoint: Select the API endpoint type:
  - Chat Completions - Streaming: Real-time OpenAI Chat Completions compatible API endpoint with streaming responses
  - Chat Completions - Non-Streaming: Standard OpenAI Chat Completions compatible API endpoint
  - Generate - Streaming: Text generation with streaming responses
  - Generate - Non-Streaming: Standard text generation
- Optional Generation Parameters: OpenAI Chat Completions compatible JSON parameters that can be added to the request body (available for chat endpoints)
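For example, standard Chat Completions parameters such as `temperature`, `top_p`, or `max_tokens` can be supplied as a JSON object; the values below are purely illustrative:

```json
{
  "temperature": 0.7,
  "top_p": 0.9,
  "max_tokens": 1024
}
```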
WebSocket Configuration:
- WebSocket Schema: Select the schema for real-time connections:
  - Chat Completions - Streaming: Streaming chat over WebSocket
  - Chat Completions - Non-Streaming: Non-streaming chat over WebSocket
  - Generate - Streaming: Streaming generation over WebSocket
  - Generate - Non-Streaming: Non-streaming generation over WebSocket
Note: For intermediate results streaming, use Chat Completions - Streaming (`/chat/stream`) or Generate - Streaming (`/generate/stream`).
- Set up NeMo Agent Toolkit following the getting started guide
- Start the workflow by following the Getting Started Examples:
nat serve --config_file=examples/getting_started/simple_calculator/configs/config.yml
Interact with the chat interface by prompting the agent with the message:
Is 4 + 4 greater than the current hour of the day?
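To confirm the workflow is reachable before switching to the UI, the same prompt can be sent directly to the backend. This is only a sketch using the OpenAI Chat Completions compatible `/chat` endpoint described in the settings above, assuming the default local `nat serve` host and port; adjust the URL to your configuration:

```bash
curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Is 4 + 4 greater than the current hour of the day?"}]}'
```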
- Set up NeMo Agent Toolkit following the getting started guide
- Start the workflow by following the HITL Example:
nat serve --config_file=examples/HITL/simple_calculator_hitl/configs/config-hitl.yml
Enable WebSocket mode in the settings panel for bidirectional real-time communication between the client and server; a quick connectivity check is sketched after the steps below.
- Send the following prompt:
Can you process my input and display the result for the given prompt: How are you today?
- Enter your response when prompted
- Monitor the result
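As a quick connectivity check for WebSocket mode, a client such as `wscat` can connect to the same endpoint the UI uses. This is only a sketch: the host, port, and path below assume a local backend and mirror the `NEXT_PUBLIC_WEBSOCKET_URL` and `NEXT_PUBLIC_WEBSOCKET_PATH` settings, and the actual message schema is described in the WebSocket Documentation:

```bash
# Open a raw WebSocket connection to verify the server accepts it
npx wscat -c ws://localhost:8000/websocket
```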
The UI supports both HTTP requests (OpenAI Chat compatible) and WebSocket connections for server communication. For detailed information about WebSocket messaging integration, please refer to the WebSocket Documentation in the NeMo Agent Toolkit documentation.
This project is licensed under the MIT License - see the LICENSE file for details. The project includes code from chatbot-ui and chatbot-ollama, which are also MIT licensed.