An interactive portfolio that leverages the Gemini API to provide a dynamic, conversational experience. This is not just a static portfolio; it's an interactive application where users can chat with an AI assistant to learn more about my work.
Traditional portfolios are static and passive. This project transforms the conventional portfolio into an engaging, interactive experience, allowing visitors to directly query an AI assistant about projects and skills, providing a deeper, more personalized understanding of my work.
- ✨ Features
- 🛠️ Technology Stack
- 🏗️ Architecture
- 🧪 Testing
- 📚 Documentation
- 🔒 API Access Model & Security
- 🚀 Quick Start
- 🐳 Docker
- 📸 Visual Demo
- 🤝 Contributing
- 📄 License
- 📫 Contact
- 🤖 Conversational AI Chatbot: Engage directly with an AI assistant powered by the Gemini API to explore projects and gain insights.
- 🎨 Dynamic Project Showcase: A clean, modern interface designed to beautifully present diverse portfolio projects.
- 🔍 Intelligent Semantic Search: Leverage AI to search for projects using natural-language queries, returning highly relevant results. Includes a robust keyword fallback and graceful handling of API quota errors, so search stays available and user-friendly.
- 🔗 Seamless Contact & Feedback Integration: The chatbot intuitively guides users to an interactive contact or feedback form, simplifying communication and gathering valuable insights.
- 🛡️ Enhanced Security: Security headers are applied in both development (via Vite) and production (via a Cloudflare `_headers` file) to protect against common web vulnerabilities.
- ⚡ Improved Performance & UI: Fixed UI bugs, including the "Flash of Unstyled Content" (FOUC), and optimized initial page load performance.
- ♿ Enhanced Accessibility: The chatbot UI includes ARIA attributes for improved accessibility, ensuring a better experience for all users.
- 🌐 Internationalization (i18n) Ready: The frontend is prepared for internationalization, allowing easy adaptation to multiple languages.
- 💾 Session-based Conversations: Chat history is automatically saved to `sessionStorage`, ensuring continuity within a single browser tab and clearing when the tab closes. The full conversation history is sent with each request to the worker, so the AI model maintains context.
- 🎤 Intuitive Voice Input: The application is designed to allow hands-free chatbot interaction via voice-to-text using the Web Speech API. Note: this feature is planned for future implementation.
- ⚡ Offline Support (PWA Ready): The application registers a service worker, laying the groundwork for Progressive Web App (PWA) features like offline access and faster loading.
- 🌓 Adaptive Light/Dark Mode: Personalize your viewing experience with a toggle for light and dark themes.
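The session-based conversation feature described above boils down to serializing the message list into `sessionStorage` and restoring it on load. The sketch below illustrates the idea; the `ChatMessage` shape and the `CHAT_HISTORY_KEY` name are assumptions for illustration, not the project's actual identifiers.

```typescript
// Sketch of session-scoped chat persistence (assumed key name and message shape).
interface ChatMessage {
  role: "user" | "model";
  text: string;
}

const CHAT_HISTORY_KEY = "chatHistory"; // hypothetical storage key

// Accept any Storage-like object so the logic is testable outside a browser.
interface StorageLike {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function saveHistory(history: ChatMessage[], store: StorageLike): void {
  store.setItem(CHAT_HISTORY_KEY, JSON.stringify(history));
}

function loadHistory(store: StorageLike): ChatMessage[] {
  const raw = store.getItem(CHAT_HISTORY_KEY);
  if (!raw) return [];
  try {
    return JSON.parse(raw) as ChatMessage[];
  } catch {
    return []; // corrupted entry: start a fresh conversation
  }
}
```

In the browser these would be called with `window.sessionStorage`, which is what gives the "clears when the tab closes" behavior.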
This project is built with a selection of modern and efficient technologies, chosen for their performance, flexibility, and developer experience.
- Frontend: TypeScript, HTML5, CSS3 (no framework; uses JavaScript template literals for HTML templating)
- AI Layer: Cloudflare Workers (secure API proxy, distributed KV-backed rate limiting, refined guardrails, embedding generation with caching, calling the Google Gemini API directly), Google Gemini API (`gemini-2.0-flash` model for chat, `embedding-001` model for embeddings)
- Testing: Playwright (for end-to-end testing), Vitest (for Worker unit testing)
- Speech Recognition: Web Speech API
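The KV-backed rate limiting mentioned in the AI layer can be approximated with a fixed-window counter. The sketch below is illustrative only: the worker's actual window length, request budget, and key format are not documented in this README.

```typescript
// Sketch of a fixed-window, KV-backed rate limiter (illustrative values).
interface KvLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}

const WINDOW_SECONDS = 60; // assumed window length
const MAX_REQUESTS = 10;   // assumed per-window request budget

async function isRateLimited(kv: KvLike, clientIp: string, nowMs: number): Promise<boolean> {
  const windowIndex = Math.floor(nowMs / 1000 / WINDOW_SECONDS);
  const key = `rl:${clientIp}:${windowIndex}`; // one counter per client per window
  const count = parseInt((await kv.get(key)) ?? "0", 10);
  if (count >= MAX_REQUESTS) return true;
  // Note: read-then-write on KV is not atomic, so this is best-effort limiting.
  await kv.put(key, String(count + 1), { expirationTtl: WINDOW_SECONDS * 2 });
  return false;
}
```

In a real Worker, the bound `RATE_LIMIT_KV` namespace would be passed as `kv`, and the client identifier would typically come from the `CF-Connecting-IP` request header.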
The application is a client-side single-page application (SPA) whose AI features are served through a Cloudflare Worker: the browser never calls the Google Gemini API directly; instead, the Worker acts as a secure proxy.
For a better viewing experience on GitHub, the diagram is rendered from a `.mmd` file. It includes icons, which require Font Awesome to be available in the rendering environment.
```mermaid
flowchart LR
    subgraph "Browser"
        A[Vite SPA]
    end
    subgraph "Cloudflare"
        B[Worker]
        C[KV RATE_LIMIT_KV]
        G[Guardrails]
        T[Tools]
        E[KV PROJECT_EMBEDDINGS_KV]
    end
    subgraph "Google Cloud"
        D[Gemini API]
    end
    %% Connections
    A -- "POST /chat (prompt, history)" --> B
    A -- "POST /api/generateEmbedding" --> B
    B -- "Auth & Rate Limit" --> C
    B -- "Apply Guardrails" --> G
    G -- "If safe, proceed" --> B
    B -- "generateContent (with tools, history)" --> D
    D -- "response (text/tool_call)" --> B
    B -- "Execute Tool (e.g., projectSearch)" --> T
    T -- "Tool Output (projects, notice)" --> B
    B -- "Cache Query Embedding" --> E
    E -- "Retrieve Project Embeddings" --> T
    B -- "Streaming SSE (text/tool_response)" --> A
```
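On the client side, the streaming SSE leg at the bottom of the diagram amounts to reading chunks and parsing `data:` frames as they arrive. The frame payload shape below (`{ text: string }`) is an assumption for illustration; the worker's actual frame format is not documented here.

```typescript
// Sketch: extract text fragments from a chunk of Server-Sent Events data.
// Assumes each frame is `data: {"text": "..."}` and `data: [DONE]` ends the stream.
function parseSseChunk(chunk: string): string[] {
  const texts: string[] = [];
  for (const line of chunk.split("\n")) {
    if (!line.startsWith("data:")) continue; // skip blank lines and comments
    const payload = line.slice("data:".length).trim();
    if (payload === "[DONE]") continue;
    try {
      const event = JSON.parse(payload);
      if (typeof event.text === "string") texts.push(event.text);
    } catch {
      // ignore partial or malformed frames in this sketch
    }
  }
  return texts;
}

// Browser usage (sketch):
// const res = await fetch(`${WORKER_URL}/chat`, { method: "POST", body: ... });
// const reader = res.body!.getReader();
// const decoder = new TextDecoder();
// ...read chunks in a loop, decode them, and feed parseSseChunk(...) to the UI.
```

A production reader would also buffer incomplete frames across chunk boundaries, which this sketch deliberately omits.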
The Cloudflare Worker acts as a secure proxy and backend for AI-related functionalities, exposing the following key endpoints:
- `/chat`: Handles conversational requests, forwarding them to the Gemini API, applying rate limiting, and enforcing guardrails to prevent sensitive content injection.
- `/api/generateEmbedding`: Generates vector embeddings for text, also protected by rate limiting and guardrails. This endpoint is designed for internal use by the application (e.g., for semantic search), not for direct client access.
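A minimal client call to the `/chat` endpoint might look like the sketch below. The request body's field names are assumptions, since the worker's exact contract isn't spelled out in this README.

```typescript
// Sketch of building a /chat request body (assumed field names).
interface ChatMessage {
  role: "user" | "model";
  text: string;
}

function buildChatRequest(prompt: string, history: ChatMessage[]): string {
  // The full history is sent with each request so the model keeps context.
  return JSON.stringify({ prompt, history });
}

// Browser usage (sketch):
// await fetch(`${import.meta.env.VITE_WORKER_URL}/chat`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: buildChatRequest("Tell me about your projects", history),
// });
```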
- Technologies: Vanilla TypeScript, HTML, CSS.
- Responsibilities: Renders the main portfolio page, including the header, hero section, and project cards. It also provides the user interface for the chatbot, including the chat window, message history, and input form. All UI manipulation is handled directly via the DOM.
- Technologies: TypeScript.
- Responsibilities: This is the core of the application, running entirely in the user's browser.
- State Management: Manages the application state, such as the conversation history.
- AI Integration: Handles communication with the Cloudflare Worker, which processes and simplifies the Gemini API's raw response before sending a clean, structured response to the frontend.
- Orchestration Logic: Contains the logic to interpret user intent based on keywords.
- Data Persistence: Uses the browser's `sessionStorage` to save and load the chat history.
- Performance Optimization: The initial page load has been optimized to ensure a fast and smooth user experience.
- Project Data: Project information is sourced from `frontend/projects.ts` and sent with each chat request to the worker.
- Conversation History: Stored in a JavaScript array in memory during the session and persisted to `sessionStorage`.
- Vector Embeddings: Project embeddings for semantic search are generated by the Cloudflare Worker (via the `/api/generateEmbedding` endpoint) and cached in frontend memory (`projectEmbeddings`) on application load.
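Once the query and project embeddings are available, semantic search typically reduces to a cosine-similarity ranking over the cached vectors. The sketch below illustrates that step; the function and type names are illustrative, not the project's actual code.

```typescript
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

interface ProjectEmbedding {
  title: string;
  vector: number[];
}

// Rank cached project embeddings against a query embedding, best match first.
function rankProjects(query: number[], projects: ProjectEmbedding[], topK = 3): string[] {
  return projects
    .map((p) => ({ title: p.title, score: cosineSimilarity(query, p.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, topK)
    .map((p) => p.title);
}
```

The keyword fallback mentioned in the Features section would kick in when no query embedding can be generated (e.g., on an API quota error).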
- Technologies: Docker, Nginx, GitHub Pages, Cloudflare Workers.
- Responsibilities: The application includes a multi-stage `Dockerfile` for containerization and is configured for automated deployment to GitHub Pages via GitHub Actions. The AI backend is deployed as a Cloudflare Worker.
Frontend Browser -> Cloudflare Worker -> Google Gemini API
✅ Enhanced Security: The `GEMINI_API_KEY` and `ALLOWED_ORIGINS` values are securely stored as Cloudflare Worker secrets, preventing their exposure. The `VITE_WORKER_URL` for the frontend is stored as a GitHub repository secret. This approach is suitable for production environments. The Cloudflare Worker also implements refined guardrails with an adjusted `TRIPWIRE` regex to prevent false positives while maintaining strong protection against sensitive content injection. Security headers are applied in both development (via Vite) and production (via a Cloudflare `_headers` file) to protect against common web vulnerabilities, and the Content Security Policy (CSP) has been further hardened to mitigate XSS risks.
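A tripwire guardrail of this kind usually amounts to screening the prompt against a list of sensitive patterns before it reaches the model. The patterns below are illustrative examples only, not the worker's actual `TRIPWIRE` regex.

```typescript
// Guardrail sketch: reject prompts matching sensitive patterns (illustrative list).
const TRIPWIRE_PATTERNS: RegExp[] = [
  /\bapi[_\s-]?key\b/i,                       // attempts to extract credentials
  /\bignore (all )?previous instructions\b/i, // common prompt-injection phrasing
];

function violatesGuardrails(prompt: string): boolean {
  return TRIPWIRE_PATTERNS.some((re) => re.test(prompt));
}
```

Using word boundaries (`\b`) is one way to reduce the false positives the adjusted regex is meant to avoid, e.g. so that a word merely containing "key" does not trip the filter.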
To ensure the reliability and quality of the application, a comprehensive testing strategy is employed:
- End-to-End (E2E) Testing with Playwright:
- Simulates real user interactions in a browser to validate the entire application workflow, including UI, application logic, and API integrations.
- Covers key scenarios like general conversation, contact form submission, rate limiting, and guardrail enforcement. It also includes comprehensive security tests to validate guardrails against various attack scenarios.
- All E2E tests are currently passing.
  - To run E2E tests: `npx playwright test`
- Worker Unit Testing with Vitest:
- Ensures the individual components and logic of the Cloudflare Worker function correctly.
- All worker unit tests are currently passing. A critical bug related to the Gemini model was recently identified and fixed, and all tests continue to pass after the resolution, ensuring the chatbot's stability.
  - To run Worker unit tests: `npm test --prefix worker`
1. Install dependencies:
   - Run `npm install` in the project root.
   - Run `npm install --prefix worker` in the project root to install worker-specific dependencies.
2. Set up Environment Variables (Development):
   - In the `frontend` directory, create a `.env.local` file with the following content:

     ```
     VITE_WORKER_URL="http://localhost:8787"
     ```

   - In the `worker` directory, create a `.dev.vars` file with the following content:

     ```
     GEMINI_API_KEY="YOUR_GOOGLE_AI_STUDIO_KEY_HERE"
     ALLOWED_ORIGINS="http://localhost:5173,http://127.0.0.1:5173"
     RATE_LIMIT_KV_ID="YOUR_KV_NAMESPACE_ID_HERE"
     PROJECT_EMBEDDINGS_KV_ID="YOUR_KV_NAMESPACE_ID_HERE"
     ```
3. Set up Environment Variables (Production):
   - Cloudflare Worker Secrets:
     - `GEMINI_API_KEY`: Your Google AI Studio key (set via `npx wrangler secret put GEMINI_API_KEY`).
     - `ALLOWED_ORIGINS`: Your GitHub Pages URL (e.g., `https://gmpho.github.io`) (set via `npx wrangler secret put ALLOWED_ORIGINS`).
     - `RATE_LIMIT_KV_ID`: The ID of your `RATE_LIMIT_KV` namespace (set via `npx wrangler secret put RATE_LIMIT_KV_ID`).
     - `PROJECT_EMBEDDINGS_KV_ID`: The ID of your `PROJECT_EMBEDDINGS_KV` namespace (set via `npx wrangler secret put PROJECT_EMBEDDINGS_KV_ID`).
   - GitHub Repository Secret:
     - `VITE_WORKER_URL`: The URL of your deployed Cloudflare Worker (e.g., `https://ai-powered-static-portfolio-worker.<YOUR_ACCOUNT_NAME>.workers.dev`) (set via `gh secret set VITE_WORKER_URL`).
4. Run the development servers:
   - In the project root, run `npm run dev`.
5. Production Security Headers:
   - For production deployments on Cloudflare Pages, a `_headers` file is included in the `frontend/public` directory. It contains security headers that Cloudflare applies automatically.
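The repository's own file isn't reproduced here, but a Cloudflare Pages `_headers` file generally follows the shape below. The specific header values are illustrative assumptions, not necessarily the ones shipped in `frontend/public`:

```
/*
  X-Frame-Options: DENY
  X-Content-Type-Options: nosniff
  Referrer-Policy: strict-origin-when-cross-origin
```

The `/*` matcher applies the headers to every route; more specific path patterns can be added below it to override headers per route.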
For detailed troubleshooting, refer to the Debugging and Troubleshooting section in `GEMINI.md` and the Known Issues document for specific resolutions.
Containerize this application for consistent and isolated environments using Docker.
Build the image:

```bash
# The frontend Docker image does not require the API key.
docker build -t ai-portfolio .
```

Run the container:

```bash
docker run -p 8080:80 ai-portfolio
```

The application will be available at http://localhost:8080.
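The multi-stage `Dockerfile` mentioned earlier typically follows the pattern below for a Vite + Nginx setup. Treat this as a sketch of the approach (assumed Node version, build output in `dist/`), not the repository's exact file:

```dockerfile
# Stage 1: build the Vite frontend
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: serve the static output with Nginx
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80
```

The two-stage split keeps the Node toolchain out of the final image, which is why no API key (or any build secret) needs to be baked in.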
Experience the interactive AI-powered portfolio in action:
Contributions are welcome! Please see the CONTRIBUTING.md for guidelines.
This project is licensed under the MIT License - see the LICENSE file for details.
For questions or feedback, please open an issue or contact me directly.
