A template to kickstart your journey into GenAI! 🎁
It combines Ollama, LangChain/LangGraph, Jupyter, Open WebUI, Agent Chat UI, and Agent Inbox!
To spin up all services, run:

```bash
make run
```
This will spin up the following services:
- `localhost:8888/<TOKEN_HERE>`: JupyterLab
- `localhost:8080`: Open WebUI
- `localhost:2024`: LangGraph Server API
- `localhost:3000`: Agent Chat UI
- `localhost:3000`: Agent Inbox
- `localhost:11434`: Ollama API
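
Once the stack is up, you can also talk to the LangGraph Server API programmatically. The snippet below is only a minimal sketch, assuming the `langgraph-sdk` package is installed on your machine and at least one graph is registered on the server; adjust the URL if you changed the port.

```python
from langgraph_sdk import get_sync_client

# Connect to the LangGraph Server API exposed by the stack (assumed to be on port 2024).
client = get_sync_client(url="http://localhost:2024")

# List the assistants (graphs) the server currently exposes.
for assistant in client.assistants.search():
    print(assistant["assistant_id"], assistant["graph_id"])
```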
If you work with Jupyter, you can use the `jupyter` folder to store your notebooks and other files. You can add your Python requirements to the `jupyter/requirements.txt` file, and they will be installed in the Jupyter container.
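
For example, a notebook can call the bundled Ollama instance through LangChain. This is a minimal sketch, assuming `langchain-ollama` is listed in `jupyter/requirements.txt`, a model such as `llama3.2` has already been pulled, and the Ollama service is reachable from inside the Jupyter container at `http://ollama:11434` (use `http://localhost:11434` when running from the host instead).

```python
from langchain_ollama import ChatOllama

# Assumed hostname of the Ollama service as seen from inside the Jupyter container.
llm = ChatOllama(model="llama3.2", base_url="http://ollama:11434")

# Simple single-turn invocation.
response = llm.invoke("Give me a one-sentence summary of what LangGraph is.")
print(response.content)
```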
You can define your LangGraph graph in the `langgraph_server` folder. Its requirements are automatically installed from the `langgraph_server/requirements.txt` file.
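
A graph can be as small as a single node. The sketch below is a minimal example, assuming the server is configured (e.g., via a `langgraph.json` entry) to load the compiled `graph` object from a module inside `langgraph_server`; the file and node names here are placeholders.

```python
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class State(TypedDict):
    message: str


def respond(state: State) -> dict:
    # Placeholder node: echoes the incoming message back.
    return {"message": f"Echo: {state['message']}"}


builder = StateGraph(State)
builder.add_node("respond", respond)
builder.add_edge(START, "respond")
builder.add_edge("respond", END)

# The LangGraph server serves this compiled graph.
graph = builder.compile()
```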
If you want to preload models, just add them to the `ollama/models_to_preload.txt` file.
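
For example, the file might look like this (assuming one Ollama model tag per line; the exact format depends on how the preload script parses the file):

```text
llama3.2
nomic-embed-text
```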