- install [ollama](https://ollama.com)
- or install it with the install script, then pull a model and start the daemon (a quick API check follows below):

      curl -fsSL https://ollama.com/install.sh | sh
      # select a model from https://ollama.com/library
      ollama pull phi3
      # start the daemon
      ollama serve
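To confirm the daemon is reachable (it listens on `http://localhost:11434` by default), a quick curl against Ollama's REST API works as a sanity check:

```sh
# one-off, non-streaming completion from the model pulled above
curl -s http://localhost:11434/api/generate \
  -d '{"model": "phi3", "prompt": "Say hello in one sentence.", "stream": false}'
```

If this returns a JSON response, sidellama can reach the same endpoint.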
- install LM Studio
- download a model from the home screen, or use the search tab to pull from huggingface
- go to the `Local server` tab, hit `Start server`, and select your downloaded model (a quick check from the terminal is sketched below)
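LM Studio's local server speaks an OpenAI-compatible API, by default on `http://localhost:1234`; a minimal sketch of checking it from the terminal (adjust the port if you changed it, and use the model identifier reported by `/v1/models`):

```sh
# list the models the local server exposes
curl -s http://localhost:1234/v1/models

# minimal chat request; "local-model" is a placeholder for the identifier above
curl -s http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-model", "messages": [{"role": "user", "content": "Hello!"}]}'
```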
Groq offers a wide variety of models with a generous free tier.
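Groq's endpoint is OpenAI-compatible as well, so a test request from the shell looks roughly like the following sketch (set `GROQ_API_KEY` from your Groq console; the model id below is only an example and may have been retired, so swap in one from Groq's current model list):

```sh
curl -s https://api.groq.com/openai/v1/chat/completions \
  -H "Authorization: Bearer $GROQ_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "llama-3.1-8b-instant", "messages": [{"role": "user", "content": "Hello!"}]}'
```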
Create and modify your own personal assistants!
Check out these collections for inspiration:
- 0xeb/TheBigPromptLibrary
- sockcymbal/persona_library
- abilzerian/LLM-Prompt-Library
- kaushikb11/awesome-llm-agents
Augment your conversation with the content of your (currently visited) web page.
- select `text mode` to share the text content of your page
- select `html mode` to share the source code of the site (resource intensive, only for development purposes)
- adjust `char limit` to control the maximum number of characters you want to share in your conversation; decrease it if you have a limited context window (the effect is illustrated below)
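As a rough illustration of what `char limit` guards against (not sidellama's own code): a page's raw HTML is usually far larger than its visible text, and everything past the limit is simply dropped before it reaches the prompt.

```sh
# compare a page's full HTML size with what a 4000-character limit would keep
curl -s https://example.com -o page.html
wc -c page.html                  # full "html mode" payload
head -c 4000 page.html | wc -c   # what survives a 4000-char limit
```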
Basic web augmentation for your chats: enter your search query and sidellama runs an asynchronous web search, answering your question from live public data.
- choose `duckduckgo` or `brave` as your web source
- adjust `char limit` to control the maximum number of characters you want to share in your conversation; decrease it if you have a limited context window
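Conceptually this amounts to fetching results from the chosen engine and trimming them to the char limit before they join the prompt; a hand-rolled sketch of the same idea, assuming DuckDuckGo's HTML-only results page (not necessarily the endpoint sidellama itself calls):

```sh
# fetch no-JS search results for a query and keep only the first 4000 characters
curl -sG "https://html.duckduckgo.com/html/" --data-urlencode "q=best local llm models" | head -c 4000
```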