A modern, secure restaurant AI assistant built with React and Node.js, featuring an intelligent chatbot powered by Google Gemini AI. The application provides a complete restaurant experience with menu browsing, cart management, order tracking, and AI-powered customer support.
- AI-Powered Chatbot: Intelligent conversation assistant using Google Gemini AI
- Interactive Menu: Browse the restaurant menu with categories and dietary filters
- Smart Cart System: Add/remove items with real-time updates
- Responsive Design: Mobile-first design that works on all devices
- Secure Architecture: API keys kept server-side, never exposed to the frontend
- High Performance: Optimized with React memoization and backend caching
- Security First: CORS protection, rate limiting, input validation
- Real-time Updates: Live order tracking and status updates
```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│       UI        │    │       BFF       │    │      AGENT      │
│  (React/Vite)   │───►│ (Node/Express)  │───►│  (TypeScript)   │
│   Port: 3000    │    │   Port: 3001    │    │                 │
└─────────────────┘    └─────────────────┘    └─────────────────┘
         │                      │                      │
         │                      │                      │
         ▼                      ▼                      ▼
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│  REST Backend   │    │       MCP       │    │    AI MODEL     │
│  (Direct Call)  │    │     (Tools)     │    │   (Gemini AI)   │
│   Port: 8000    │    │   Port: 8000    │    │                 │
└─────────────────┘    └─────────────────┘    └─────────────────┘
```
- UI → BFF: User interactions (chat, menu browsing, cart operations)
- BFF → Agent: Chat messages processed by Gemini AI with MCP tools
- Agent → MCP: AI calls restaurant tools for menu/cart/order data
- UI → REST: Direct API calls for menu, cart, and order operations
- UI (`restaurant-ui/`): React + TypeScript + Vite
  - Modern responsive interface
  - Real-time chat with AI assistant
  - Menu browsing and cart management
  - Order tracking and checkout
- BFF (Backend for Frontend) (`restaurant-bff/`): Node.js + Express + TypeScript
  - API gateway for Agent interaction
  - Chat processing with session management
- Agent (`restaurant-bff/server/src/agent/`): AI processing layer
  - Google Gemini AI integration
  - MCP client for tool calling
  - OAuth authentication handling
  - Session and conversation management
- MCP Server (`restaurant-mcp-server/`): Model Context Protocol server
  - AI tools for restaurant operations
  - REST API for direct data access
  - OAuth 2.0 authentication
  - Menu, cart, and order management
  - Also contains the REST backend API server, which:
    - Provides REST endpoints
    - Handles menu data, cart sessions, and orders
    - Enforces secure authentication and authorization
The Restaurant AI Assistant uses a combined MCP Server that provides both AI tools via the Model Context Protocol and REST API endpoints for direct data access. The MCP server acts as the central backend service handling restaurant business logic.
```
UI (Direct) → REST API → Restaurant Data
UI (Chat)   → BFF → Agent → MCP Server → Restaurant Data
```
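For the chat path, the BFF exposes a small endpoint that forwards each message to the Agent and returns its reply. Below is a minimal sketch of such a handler; the route wiring and the `Agent.handleMessage` interface are illustrative assumptions, not the project's actual code:

```typescript
import express from "express";
import { randomUUID } from "node:crypto";

// Hypothetical Agent interface: wraps Gemini plus MCP tool calling.
interface Agent {
  handleMessage(sessionId: string, message: string): Promise<string>;
}

export function createBff(agent: Agent) {
  const app = express();
  app.use(express.json());

  // POST /api/chat — body: { message, sessionId? }
  app.post("/api/chat", async (req, res) => {
    const { message, sessionId } = req.body ?? {};
    if (typeof message !== "string" || message.trim() === "") {
      return res.status(400).json({ success: false, error: "message is required" });
    }

    const id: string = sessionId ?? randomUUID(); // new session on first contact
    try {
      const response = await agent.handleMessage(id, message);
      res.json({ success: true, response, sessionId: id, timestamp: new Date().toISOString() });
    } catch {
      // Never leak internal error details to the client.
      res.status(502).json({ success: false, error: "Agent unavailable" });
    }
  });

  return app;
}
```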
The MCP server serves dual purposes: it exposes AI tools over the Model Context Protocol and provides REST API endpoints for direct data access.

The following MCP tools are available to the AI for intelligent restaurant operations (a registration sketch follows the list):

- Menu Management: `get_menu_categories`, `list_items_by_category`, `get_item_details`, `find_items_by_criteria`
- Cart Operations: `create_cart`, `add_to_cart`, `remove_from_cart`, `get_cart`
- Order Management: `checkout`, `get_order_status`, `list_orders`, `get_my_orders`, `add_order_note`
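As an illustration of how tools like these can be registered with the MCP TypeScript SDK (`@modelcontextprotocol/sdk`), here is a minimal sketch; the handler bodies and data are stand-ins, not the project's actual implementation:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "restaurant-mcp-server", version: "1.0.0" });

// Stand-in data; the real server reads from the restaurant data store.
const categories = ["Appetizers", "Main Courses", "Desserts", "Beverages"];

// A no-argument tool the AI can call.
server.tool("get_menu_categories", "List the restaurant's menu categories", async () => ({
  content: [{ type: "text", text: JSON.stringify(categories) }],
}));

// A tool with typed arguments, validated with zod.
server.tool(
  "find_items_by_criteria",
  "Find menu items matching dietary filters",
  { dietary: z.string().optional() },
  async ({ dietary }) => ({
    content: [{ type: "text", text: `Items matching dietary filter: ${dietary ?? "any"}` }],
  })
);

// ...an HTTP (or stdio) transport is then attached so clients can call these tools.
```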
REST API endpoints for direct data access (a client-side call sketch follows the list):

- Menu: `GET /api/menu/categories`, `GET /api/menu/items`
- Cart: `POST /api/cart`, `GET /api/cart/:sessionId`, `POST /api/cart/:sessionId/items`
- Orders: `POST /api/orders`, `GET /api/orders/:orderId`
- Health: `GET /health`, `GET /api-docs`
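On the UI side these endpoints are called directly, without going through the BFF. A minimal sketch of such a call from the React app, assuming the `VITE_API_BASE_URL` setting described later in this README (the response type is an assumption):

```typescript
// Hypothetical API helper for the React UI; the response shape is an assumption.
const API_BASE = import.meta.env.VITE_API_BASE_URL ?? "http://localhost:8000/api";

export interface MenuCategory {
  id: string;
  name: string;
}

export async function fetchMenuCategories(): Promise<MenuCategory[]> {
  const res = await fetch(`${API_BASE}/menu/categories`);
  if (!res.ok) {
    throw new Error(`Failed to load menu categories: ${res.status}`);
  }
  return res.json();
}
```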
The MCP server implements enterprise-grade security:
- OAuth 2.0 Integration: Supports Asgardeo and WSO2 Identity Server
- Protected Endpoints: All MCP operations require authentication
- Session Management: Secure session handling with automatic cleanup
- CORS Configuration: Controlled cross-origin access
- JWT Processing: Token validation and user context
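The heavy lifting here is done by `@asgardeo/mcp-express` (listed under dependencies below). Purely to illustrate the token-validation step, the following is a minimal Express middleware sketch using `jsonwebtoken`; the key handling and claim usage are assumptions, not the project's actual implementation:

```typescript
import type { Request, Response, NextFunction } from "express";
import jwt from "jsonwebtoken";

// Illustrative only: the real server validates tokens issued by Asgardeo / WSO2 IS.
export function requireAuth(secretOrPublicKey: string) {
  return (req: Request, res: Response, next: NextFunction) => {
    const header = req.headers.authorization ?? "";
    const token = header.startsWith("Bearer ") ? header.slice("Bearer ".length) : null;
    if (!token) {
      return res.status(401).json({ error: "Missing bearer token" });
    }
    try {
      // Attach the decoded claims as the user context for downstream handlers.
      (req as Request & { user?: unknown }).user = jwt.verify(token, secretOrPublicKey);
      next();
    } catch {
      res.status(401).json({ error: "Invalid or expired token" });
    }
  };
}
```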
- Navigate to the MCP server directory:

  ```
  cd restaurant-mcp-server
  ```

- Install dependencies:

  ```
  npm install
  ```

- Configure the environment:

  ```
  cp .env.example .env   # Create it if it doesn't exist
  ```

  Update `.env` with your authentication settings:

  ```
  # Identity Provider Configuration
  BASE_URL=https://api.asgardeo.io/t/your-tenant
  # OR for WSO2 Identity Server:
  # BASE_URL=https://localhost:9443

  # Server Configuration
  PORT=8000
  NODE_TLS_REJECT_UNAUTHORIZED=0   # Development only
  ```

- Start the MCP server:

  ```
  # Development mode
  npm run dev

  # Production mode
  npm run build
  npm start
  ```
The backend automatically connects to the MCP server at `http://localhost:8000/mcp`. Configure the connection in the backend's `.env`:

```
MCP_SERVER_URL=http://localhost:8000/mcp
```

Health check:

```
curl http://localhost:8000/health
```

MCP endpoint (protected):

```
curl -H "Authorization: Bearer <token>" \
  http://localhost:8000/mcp
```

Available tools list:

```
curl -H "Authorization: Bearer <token>" \
  -X POST http://localhost:8000/mcp \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'
```

Example interaction flow (a condensed code sketch of this loop follows the list):

- Customer Inquiry: "What's on the menu?"
- AI Processing: The AI receives the query via the chat endpoint
- MCP Tool Call: The AI calls the `get_menu_categories` tool
- Data Retrieval: The MCP server fetches categories from the restaurant data
- Response Generation: The AI formats a response with the menu information
- Customer Response: "We have Appetizers, Main Courses, Desserts, and Beverages..."
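The sketch below shows how an agent loop like the one in `restaurant-bff/server/src/agent/` could bridge Gemini function calling and MCP tools using `@google/generative-ai` and the MCP SDK client. The model name, transport, simplified schema mapping, and single tool-call round are assumptions made for brevity, not the project's exact implementation:

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

export async function answer(question: string): Promise<string> {
  // 1. Connect to the MCP server and discover its tools.
  const mcp = new Client({ name: "restaurant-agent", version: "1.0.0" });
  await mcp.connect(new StreamableHTTPClientTransport(new URL("http://localhost:8000/mcp")));
  const { tools } = await mcp.listTools();

  // 2. Expose those tools to Gemini as function declarations.
  const genAI = new GoogleGenerativeAI(process.env.GOOGLE_AI_API_KEY!);
  const model = genAI.getGenerativeModel({
    model: "gemini-1.5-flash", // assumption: any function-calling-capable Gemini model
    tools: [{
      functionDeclarations: tools.map((t) => ({
        name: t.name,
        description: t.description ?? "",
        // parameters omitted for brevity; a real agent maps t.inputSchema to a Gemini schema
      })),
    }],
  });

  // 3. One round of "ask -> tool call -> final answer" (the real agent loops until done).
  const chat = model.startChat();
  const first = await chat.sendMessage(question);
  const call = first.response.functionCalls()?.[0];
  if (!call) return first.response.text();

  const toolResult = await mcp.callTool({
    name: call.name,
    arguments: call.args as Record<string, unknown>,
  });
  const second = await chat.sendMessage([
    { functionResponse: { name: call.name, response: { result: toolResult } } },
  ]);
  return second.response.text();
}
```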
- @modelcontextprotocol/sdk: Core MCP protocol implementation
- @asgardeo/mcp-express: Authentication middleware
- express: Web server framework
- jsonwebtoken: JWT token handling
- swagger-ui-express: API documentation (optional)
- Node.js 18+ (Download)
- npm or yarn package manager
- Google AI API Key (Get from Google AI Studio)
- Identity Provider (Asgardeo or WSO2 Identity Server) for MCP authentication
- Create an MCP Client application with appropriate redirect URIs (e.g., `http://localhost:3001/api/oauth/callback`)
- Use the respective client ID in the backend `.env`
- Enable App-Native Authentication from the "Advanced" settings tab in the MCP Client configuration.
Quick start:

```
# Clone the repository
git clone <repository-url>
cd restaurant-ai-assistant

# Run the automated setup script for backend
./setup-backend.sh

# Setup MCP Server (provides REST API + MCP tools)
cd restaurant-mcp-server
npm install
cp .env.example .env   # Configure authentication
npm run dev

# In another terminal, start the BFF (Backend for Frontend)
cd ../restaurant-bff/server
npm run dev

# In another terminal, start the frontend
cd ../../restaurant-ui
npm run dev
```

To set up each service individually:

MCP server:

```
cd restaurant-mcp-server
npm install
cp .env.example .env
# Configure BASE_URL for your identity provider
npm run dev   # Runs on port 8000
```

BFF server:

```
cd restaurant-bff/server
npm install
cp .env.example .env
# Configure GOOGLE_AI_API_KEY and MCP_SERVER_URL
npm run dev   # Runs on port 3001
```

Frontend:

```
cd restaurant-ui
npm install
npm run dev   # Runs on port 3000
```

Once running, the services are available at:

- Frontend: http://localhost:3000
- BFF API: http://localhost:3001/api
- REST Backend: http://localhost:8000/api
- MCP Tools: http://localhost:8000/mcp
- API Docs: http://localhost:8000/api-docs
BFF server (`restaurant-bff/server/.env`):

```
# Required
GOOGLE_AI_API_KEY=your_google_ai_api_key_here
MCP_SERVER_URL=http://localhost:8000/mcp
BACKEND_API_URL=http://localhost:8000/api

# Optional
PORT=3001
NODE_ENV=development
ALLOWED_ORIGINS=http://localhost:5173,http://localhost:3000

# OAuth Configuration (if using MCP with OAuth)
OAUTH_CLIENT_ID=your_oauth_client_id
OAUTH_CLIENT_SECRET=your_oauth_client_secret
OAUTH_REDIRECT_URI=http://localhost:3001/api/oauth/callback
```

MCP server (`restaurant-mcp-server/.env`):

```
# Identity Provider Configuration (choose one)
# For Asgardeo:
BASE_URL=https://api.asgardeo.io/t/your-tenant
# For WSO2 Identity Server:
# BASE_URL=https://localhost:9443

# Server Configuration
PORT=8000
NODE_TLS_REJECT_UNAUTHORIZED=0   # Development only - disable SSL verification

# Optional: Custom MCP Resource Identifier
# MCP_RESOURCE=http://localhost:8000/mcp
```

Frontend (`restaurant-ui/.env`):

```
# BFF (Backend for Frontend) - for chat functionality
VITE_BFF_BASE_URL=http://localhost:3001/api

# REST Backend (MCP Server) - for menu, cart, orders
VITE_API_BASE_URL=http://localhost:8000/api
```

BFF API endpoints:

| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/health` | Health check with system status |
| POST | `/api/chat` | Send chat message to AI assistant |
| GET | `/api/oauth/callback` | OAuth authorization callback |

MCP/REST server endpoints:

| Method | Endpoint | Description | Auth Required |
|---|---|---|---|
| GET | `/health` | MCP server health check | ❌ |
| GET | `/api-docs` | Swagger API documentation | ❌ |
| GET | `/api/menu/categories` | Get menu categories | ❌ |
| GET | `/api/menu/items` | Get menu items with filters | ❌ |
| POST | `/api/cart` | Create shopping cart | ❌ |
| GET | `/api/cart/:sessionId` | Get cart contents | ❌ |
| POST | `/api/cart/:sessionId/items` | Add item to cart | ❌ |
| PUT | `/api/cart/:sessionId/items/:itemId` | Update cart item | ❌ |
| DELETE | `/api/cart/:sessionId/items/:itemId` | Remove cart item | ❌ |
| POST | `/api/orders` | Place new order | ❌ |
| GET | `/api/orders/:orderId` | Get order details | ❌ |
| POST | `/mcp` | MCP protocol endpoint for AI tools | ✅ OAuth |
Send a chat message to the AI assistant:

```
curl -X POST http://localhost:3001/api/chat \
-H "Content-Type: application/json" \
-d '{"message": "What are today'\''s specials?", "sessionId": "optional-session-id"}'Response:
{
"success": true,
"response": "Today we have several specials including...",
"sessionId": "generated-or-provided-session-id",
"timestamp": "2024-01-15T10:30:00.000Z"
}
```
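For reference, the same call from TypeScript in the UI, with the response typed to match the JSON shown above (the interface name and base-URL handling are illustrative assumptions):

```typescript
// Types mirror the response JSON above; names are illustrative.
interface ChatResponse {
  success: boolean;
  response: string;
  sessionId: string;
  timestamp: string;
}

const BFF_BASE = import.meta.env.VITE_BFF_BASE_URL ?? "http://localhost:3001/api";

export async function sendChatMessage(message: string, sessionId?: string): Promise<ChatResponse> {
  const res = await fetch(`${BFF_BASE}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message, sessionId }),
  });
  if (!res.ok) throw new Error(`Chat request failed: ${res.status}`);
  return res.json();
}
```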
Check service health:

```
curl http://localhost:3001/api/health
```

Response:

```
{
"status": "healthy",
"timestamp": "2024-01-15T10:30:00.000Z",
"service": "restaurant-ai-backend",
"version": "1.0.0",
"environment": "development",
"sessions": {
"active": 5,
"maxAge": "24 hours"
}
}
```

BFF server (`restaurant-bff/server`) scripts:

```
npm run dev          # Start development server with hot reload
npm run build        # Build TypeScript to JavaScript
npm run start        # Start production server
npm run lint         # Run ESLint code quality checks
npm run type-check   # Run TypeScript type checking
```

MCP server (`restaurant-mcp-server`) scripts:

```
npm run dev          # Start development server with hot reload
npm run build        # Build TypeScript and copy assets
npm run start        # Start production server
npm run clean        # Remove build artifacts
npm run build:watch  # Watch mode compilation
```

Frontend (`restaurant-ui`) scripts:

```
npm run dev          # Start Vite development server
npm run build        # Build for production
npm run preview      # Preview production build locally
```
- Start the MCP server (provides REST API and AI tools):

  ```
  cd restaurant-mcp-server
  npm run dev   # Runs on port 8000
  ```

- Start the BFF server (handles chat and API proxying):

  ```
  cd restaurant-bff/server
  npm run dev   # Runs on port 3001
  ```

- Start the frontend (React UI):

  ```
  cd restaurant-ui
  npm run dev   # Runs on port 3000
  ```

- Test the integration:

  ```
  # Test MCP/REST server health
  curl http://localhost:8000/health

  # Test BFF health
  curl http://localhost:3001/api/health

  # Test chat with AI assistant
  curl -X POST http://localhost:3001/api/chat \
    -H "Content-Type: application/json" \
    -d '{"message": "Show me the menu"}'

  # Test direct REST API call
  curl http://localhost:8000/api/menu/categories
  ```
- API Key Protection: Google AI keys stored server-side only
- CORS Configuration: Restricted to allowed origins
- Rate Limiting:
- Chat endpoint: 30 requests per 15 minutes
- General API: 100 requests per 15 minutes
- Input Validation: Message length limits and sanitization
- Request Timeouts: 60s for chat, 30s for other requests
- Security Headers: Helmet.js protection enabled
- Error Handling: No sensitive information leaked in responses
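To illustrate how the protections listed above are commonly wired together in an Express app, here is a minimal sketch using `helmet`, `cors`, and `express-rate-limit`; the project's actual middleware order and options may differ:

```typescript
import express from "express";
import helmet from "helmet";
import cors from "cors";
import rateLimit from "express-rate-limit";

const app = express();

app.use(helmet()); // security headers
app.use(cors({
  // Restrict cross-origin access to the configured frontends.
  origin: (process.env.ALLOWED_ORIGINS ?? "http://localhost:3000").split(","),
}));
app.use(express.json({ limit: "10kb" })); // cap request body size as basic input validation

// General API limit: 100 requests per 15 minutes per IP.
app.use("/api", rateLimit({ windowMs: 15 * 60 * 1000, max: 100 }));

// Stricter limit for the chat endpoint: 30 requests per 15 minutes per IP.
app.use("/api/chat", rateLimit({ windowMs: 15 * 60 * 1000, max: 30 }));
```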
```
# Backend quality checks
cd restaurant-bff/server
npm run lint
npm run type-check

# Frontend quality (handled by Vite)
cd restaurant-ui
npm run build   # Includes ESLint checks
```

- TypeScript: Strict type checking enabled
- ESLint: Configured for both frontend and backend
- Prettier: Code formatting consistency
- Security Audit: Regular dependency vulnerability checks
Common troubleshooting checks:

```
# Check if MCP server is running
curl http://localhost:8000/health

# Verify REST API endpoints are accessible
curl http://localhost:8000/api/menu/categories

# Verify MCP endpoint is accessible (requires auth)
curl -H "Authorization: Bearer <token>" http://localhost:8000/mcp

# Check MCP server logs for authentication errors
cd restaurant-mcp-server && npm run dev
```

```
# Check if BFF is running
curl http://localhost:3001/api/health

# Verify MCP server URL configuration
grep MCP_SERVER_URL restaurant-bff/server/.env

# Check BFF logs for connection errors
cd restaurant-bff/server && npm run dev
```

```
# Verify API key is valid in BFF
grep GOOGLE_AI_API_KEY restaurant-bff/server/.env

# Check BFF logs for API errors
cd restaurant-bff/server && npm run dev

# Ensure MCP service is running on port 8000
curl http://localhost:8000/health

# Test MCP tool availability (requires auth)
curl -X POST http://localhost:8000/mcp \
  -H "Authorization: Bearer <token>" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'
```

```
# Test direct REST API calls
curl http://localhost:8000/api/menu/categories

# Check MCP server logs
cd restaurant-mcp-server && npm run dev

# Verify frontend is configured to call correct API
grep VITE_API_BASE_URL restaurant-ui/.env
```

```
# Find what's using the ports
lsof -i :3000   # Frontend
lsof -i :3001   # BFF (Backend for Frontend)
lsof -i :8000   # MCP/REST Server

# Kill process if needed
kill -9 <PID>
```

Enable debug logging by setting:

```
NODE_ENV=development
DEBUG=restaurant-ai:*
```

We welcome contributions! Please follow these steps:
- Fork the repository
- Create a feature branch: `git checkout -b feature/your-feature-name`
- Make your changes with tests
- Run quality checks:

  ```
  cd restaurant-bff/server && npm run lint && npm run type-check
  cd ../../restaurant-ui && npm run build
  ```

- Commit your changes: `git commit -m "Add your feature"`
- Push to your branch: `git push origin feature/your-feature-name`
- Create a Pull Request
- Follow existing code style and patterns
- Add TypeScript types for new features
- Update documentation for API changes
- Test both frontend and backend functionality
- Ensure security best practices are followed
- MCP Server Documentation: Comprehensive guide for the Model Context Protocol server
- API Documentation: Complete OpenAPI specification
- Secure Backend Guide: Security implementation details
- Proxy Implementation: Backend proxy architecture
This project is licensed under the MIT License - see the LICENSE file for details.
- Google Gemini AI for powering the intelligent chatbot
- Model Context Protocol for AI service integration
- React & Vite for the modern frontend framework
- Express.js for the robust backend framework
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: See `api-docs.yaml` for the complete API specification
Happy coding!
This project maintains high code quality standards:
- ESLint: Configured for both frontend and backend with TypeScript support
- TypeScript: Strict type checking enabled
- Code Linting: Automated code quality checks
- Security: No known vulnerabilities (regular audit checks)
- Performance: Optimized React components with memoization
```
# Backend linting
cd server
npm run lint

# Frontend is handled by Vite's built-in ESLint integration
```

- Ensure backend is running on port 3001
- Check that frontend origin is allowed in CORS configuration
- Verify no firewall blocking localhost connections
- Check that the `.env` file exists and `GOOGLE_AI_API_KEY` is set
- Ensure all dependencies are installed: `npm install`
- Check for TypeScript compilation errors: `npm run build`
- Verify your Google AI API key is valid
- Check backend logs for API errors
- Ensure MCP service is running (if used)