This project combined product usability with production concerns. The goal was not only to make an LLM chat interface work, but to make it secure, stateful, and useful for real internal knowledge tasks. I worked across the stack, shaping backend APIs, data persistence, and the web experience so the system could handle authenticated users, uploaded files, and retrieval-backed conversations in one coherent flow.
Project case study
Full-Stack AI Chatbot
An AI chatbot web app with a clean interface and a secure backend. Users sign in, create projects, upload files, and chat in real time with fast LLMs served through OpenRouter, while provider API keys stay safely on the server. The system is built for reliability and scale, with streaming responses, rate limiting, usage tracking, and an event-driven architecture.
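The per-user rate limiting mentioned above can be sketched as an in-memory token bucket keyed by user id. This is illustrative only: the class and parameter names (`TokenBucket`, `checkRateLimit`, the burst/refill values) are assumptions, not the project's actual implementation, which might instead use middleware or a shared store like Redis.

```typescript
// Minimal in-memory token-bucket rate limiter, one bucket per user.
// A sketch under assumed parameters, not the project's real code.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private readonly capacity: number,     // max burst size
    private readonly refillPerSec: number, // sustained allowed rate
    now: number = Date.now(),
  ) {
    this.tokens = capacity;
    this.lastRefill = now;
  }

  // Returns true if the request is allowed, consuming one token.
  allow(now: number = Date.now()): boolean {
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillPerSec);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// Per-user buckets, keyed by a user id (e.g. the JWT "sub" claim).
const buckets = new Map<string, TokenBucket>();

function checkRateLimit(userId: string, now: number = Date.now()): boolean {
  let bucket = buckets.get(userId);
  if (!bucket) {
    bucket = new TokenBucket(5, 1, now); // assumed limits: burst of 5, then 1 req/sec
    buckets.set(userId, bucket);
  }
  return bucket.allow(now);
}
```

Keeping the limiter server-side, next to the OpenRouter proxy, means abusive clients are cut off before any provider tokens are spent.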
Key outcomes
- Implemented JWT and OAuth2-based authentication for protected user sessions.
- Added file upload and retrieval workflows to ground model responses in uploaded file content.
- Connected a responsive chat interface to streaming LLM conversations via OpenRouter.
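The retrieval workflow in the outcomes above can be sketched as: chunk each uploaded file, score chunks against the user's question, and prepend the best matches to the prompt. All names here (`chunkFile`, `buildGroundedPrompt`, the chunk size and `k`) are hypothetical, and term-overlap scoring stands in for whatever retrieval method the project actually uses (e.g. embeddings), purely to keep the sketch self-contained.

```typescript
// Sketch of grounding a chat prompt in uploaded file content.
// Term-overlap scoring is an illustrative stand-in for real retrieval.

interface Chunk { fileName: string; text: string; }

// Split an uploaded file into fixed-size word chunks for retrieval.
function chunkFile(fileName: string, content: string, wordsPerChunk = 120): Chunk[] {
  const words = content.split(/\s+/).filter(Boolean);
  const chunks: Chunk[] = [];
  for (let i = 0; i < words.length; i += wordsPerChunk) {
    chunks.push({ fileName, text: words.slice(i, i + wordsPerChunk).join(" ") });
  }
  return chunks;
}

// Score a chunk by how many distinct query terms appear in it.
function overlapScore(query: string, chunk: Chunk): number {
  const terms = new Set(query.toLowerCase().split(/\W+/).filter(Boolean));
  const text = chunk.text.toLowerCase();
  let score = 0;
  for (const term of terms) if (text.includes(term)) score++;
  return score;
}

// Pick the top-k matching chunks and prepend them as model context.
function buildGroundedPrompt(query: string, chunks: Chunk[], k = 3): string {
  const top = [...chunks]
    .sort((a, b) => overlapScore(query, b) - overlapScore(query, a))
    .slice(0, k)
    .filter((c) => overlapScore(query, c) > 0);
  const context = top.map((c) => `[${c.fileName}] ${c.text}`).join("\n---\n");
  return `Answer using only the context below.\n\nContext:\n${context}\n\nQuestion: ${query}`;
}
```

The grounded prompt is then sent through the server-side OpenRouter proxy, so file content and API keys never leave the backend.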