This is an AI chat app bootstrapped with the LM Studio API and Next.js. To run it locally you will need:
- Node.js (v18+ recommended)
- LM Studio running locally with accessible API endpoints.
- Environment variable file (`.env`) containing `NEXT_PUBLIC_LM_STUDIO_URL=http://192.168.100.7:1234` (a quick connectivity check is sketched after this list).
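Before going further, you can confirm that the LM Studio server is reachable from your machine. The snippet below is a minimal sketch for Node 18+, assuming LM Studio exposes its OpenAI-compatible `/v1/models` endpoint at the URL from your `.env`; adjust the host and port to match your setup.

```ts
// check-lmstudio.ts — minimal connectivity check (assumes LM Studio's
// OpenAI-compatible server is running and lists loaded models at GET /v1/models).
const base = "http://192.168.100.7:1234"; // keep in sync with NEXT_PUBLIC_LM_STUDIO_URL

fetch(`${base}/v1/models`)
  .then((res) => res.json())
  .then((data) => console.log("LM Studio is reachable. Models:", data))
  .catch((err) => console.error("LM Studio is not reachable:", err));
```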
- Clone this repository with `git clone https://github.com/nabinkhair42/snip.git`, then move into it with `cd snip`.
- Install dependencies with `pnpm install`, or with `yarn install` if you prefer Yarn.
- Create a `.env` file at the project root and add `NEXT_PUBLIC_LM_STUDIO_URL=http://192.168.100.7:1234`.
- Start the development server with `npm run dev` (or `yarn dev` if you use Yarn), then open http://localhost:3000 in your browser.
Project structure:

- `src/app/` – Contains Next.js pages and API routes.
  - `api/chat/route.ts` – API route that sends chat requests to LM Studio and returns a streamed response (a minimal sketch follows the structure listing).
  - `page.tsx` – Main entry point for the chat application's UI.
- `src/components/` – Contains the UI components:
  - `ChatHeader.tsx` – Header showing the app title, model information, theme toggle, and clear-conversation control.
  - `ChatWindow.tsx` – Displays the chat messages.
  - `ChatInput.tsx` – Input field for sending messages.
  - `ChatMessage.tsx` – Displays individual chat messages with markdown formatting.
  - `extended/` – Components for extended functionality (e.g., ThemeToggle, ClearConversation).
  - `ui/` – Basic UI components like Button, Input, and Badge.
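For reference, here is a minimal sketch of what `src/app/api/chat/route.ts` could look like. It assumes the route forwards `{ messages }` from the request body to LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint with streaming enabled; the actual implementation in this repository may differ in shape and options.

```ts
// src/app/api/chat/route.ts — illustrative sketch, not the repository's exact code.
export async function POST(req: Request) {
  const { messages } = await req.json(); // assumed request shape: { messages: [...] }

  // Forward the conversation to LM Studio's OpenAI-compatible endpoint.
  const upstream = await fetch(
    `${process.env.NEXT_PUBLIC_LM_STUDIO_URL}/v1/chat/completions`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages, stream: true }),
    }
  );

  // Pipe LM Studio's streamed response straight back to the browser.
  return new Response(upstream.body, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```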
- Modify the LM Studio API URL by updating the environment variable in your `.env` file (a small configuration sketch follows below).
- Customize the UI components under `src/components/` as needed.
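One way to keep the URL configurable in code is a small helper that reads the public environment variable with a local fallback. This is a hypothetical helper shown for illustration only, not a file that necessarily exists in the repository:

```ts
// src/lib/config.ts (hypothetical) — central place to read the LM Studio base URL.
export const LM_STUDIO_URL =
  process.env.NEXT_PUBLIC_LM_STUDIO_URL ?? "http://localhost:1234";
```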
Planned features:

- Ollama API integration for more conversational chat.
- Code snippet feature for code generation.
- AI model switching feature.