AI Chat
Last Updated: September 9, 2025
This document describes the technical specifications, architecture, and key components of SurfAI's AI Chat feature.
1. Feature Overview
The AI Chat feature allows users to converse directly with a Large Language Model (LLM) within the SurfAI platform. Users can input anything from simple questions to complex requests; each request is stateless, so the AI does not retain context from earlier turns of the conversation.
This feature is exposed as a feature card on the /surf page and opens in a modal window when the card is clicked.
2. Architecture and Data Flow
The AI Chat feature is designed with an architecture where the existing backend (comfy-surfai-backend) acts as an API gateway, calling a separate Python server (comfy-langchain) that handles the actual LLM logic. This approach allows for centralized authentication while separating features into microservices for better maintainability.
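The gateway pattern described above can be sketched as plain functions. This is an illustrative simulation only: function names, payload shapes, and the key value are assumptions, not the actual SurfAI code.

```python
# Illustrative sketch of the AI Chat request path (frontend -> NestJS gateway
# -> Python LLM service). All names and payload shapes are assumptions.

INTERNAL_API_KEY = "internal-secret"  # shared secret between the two backends (assumed)


def fastapi_chat(prompt: str, headers: dict) -> dict:
    """comfy-langchain: serves the chat endpoint, guarded by the internal key."""
    if headers.get("X-Internal-API-Key") != INTERNAL_API_KEY:
        return {"status": 403, "error": "invalid internal API key"}
    # The real service would call the LLM via LangChain here.
    return {"status": 200, "response": f"echo: {prompt}"}


def nestjs_gateway(prompt: str, user_jwt_valid: bool) -> dict:
    """comfy-surfai-backend: POST /langchain/chat, guarded by JwtAuthGuard."""
    if not user_jwt_valid:
        return {"status": 401, "error": "login required"}
    # The gateway attaches the server-to-server key and forwards the prompt.
    return fastapi_chat(prompt, {"X-Internal-API-Key": INTERNAL_API_KEY})


print(nestjs_gateway("Hello", user_jwt_valid=True)["status"])   # 200
print(nestjs_gateway("Hello", user_jwt_valid=False)["status"])  # 401
```

The key property of this design is visible in the sketch: user-facing authentication (JWT) and server-to-server authentication (internal API key) are checked at different hops, so the Python service never needs to know about user sessions.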
3. Key Components
A. Frontend (ChatModal.tsx)
- Location: `src/components/common/ChatModal.tsx`
- Role: A reusable modal component that handles the entire UI and user interaction for the AI chat feature.
- Key Features:
  - Includes an `Input` for user prompts and a 'Send' `Button`.
  - Calls the backend's `/langchain/chat` endpoint via `apiClient`.
  - Manages API request states (loading, error, etc.) and provides feedback to the user.
  - Displays the AI's response in a scrollable area.
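The component's request lifecycle (loading, then either a response or an error) is language-agnostic; a minimal sketch in Python, with hypothetical names standing in for the actual React/TypeScript code and `apiClient`:

```python
# Sketch of the modal's request-state handling. The real component is
# React/TypeScript; names here are hypothetical illustrations.

def send_chat(prompt: str, post) -> dict:
    """Run one chat request, tracking loading/error/response state.

    `post` stands in for apiClient's POST call and may raise on failure.
    """
    state = {"loading": True, "error": None, "response": None}
    try:
        state["response"] = post("/langchain/chat", {"prompt": prompt})
    except Exception as exc:  # surfaced to the user as an error message
        state["error"] = str(exc)
    finally:
        state["loading"] = False
    return state
```

Whatever happens, `loading` ends up `False`, which is what lets the UI re-enable the 'Send' button after either outcome.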
B. NestJS Backend (LangchainModule)
- Location: `src/langchain/`
- Role: Acts as a mediator and gateway between the frontend and the FastAPI server.
- Key Features:
  - `LangchainController`: Defines the `POST /langchain/chat` endpoint and protects it with `JwtAuthGuard` so that only logged-in users can call it.
  - `LangchainService`: Uses `HttpService` to make internal API calls to the FastAPI server. It includes the `X-Internal-API-Key` in the header for server-to-server authentication.
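The shape of the internal call that `LangchainService` makes can be sketched as building an outbound request. A sketch only: the internal URL, environment-variable name, and payload shape are assumptions for illustration.

```python
# Sketch of assembling the NestJS gateway's server-to-server request.
# URL, env-var name, and payload shape are assumptions, not the real config.
import os


def build_internal_request(prompt: str) -> dict:
    """Build the request the gateway would send to the FastAPI server."""
    return {
        "method": "POST",
        "url": "http://comfy-langchain:8000/api/v1/chat",  # assumed internal URL
        "headers": {
            "Content-Type": "application/json",
            # Shared secret for server-to-server authentication
            "X-Internal-API-Key": os.environ.get("INTERNAL_API_KEY", ""),
        },
        "body": {"prompt": prompt},
    }
```

Keeping the key in an environment variable (rather than in code) means the same build works across environments, and the frontend never sees the internal secret.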
C. FastAPI Backend (comfy-langchain)
- Location: `comfy-langchain/app/`
- Role: A Python server that uses the `LangChain` library to handle the actual LLM calls.
- Key Features:
  - `security.py`: Verifies the `X-Internal-API-Key` from the NestJS backend so that only authorized requests are processed.
  - `services.py`: Initializes the `ChatOpenAI` model via `langchain_openai` and generates a response for a given prompt.
  - `api.py`: Defines the `/api/v1/chat` endpoint and wires the security and service logic together.