Issues: mostlygeek/llama-swap
#113 · Ability to customize for koboldcpp
Labels: support (support requests), wontfix (this will not be worked on)
Opened May 4, 2025 by arbitropy

#110 · Query - How to execute docker run commands to spin up other docker containers (TabbyAPI, different branch of llama.cpp, etc)?
Labels: support (support requests)
Opened May 2, 2025 by MikeNatC

#104 · arm64 docker release
Labels: containers (llama-swap docker container related)
Opened Apr 29, 2025 by albertopasqualetto

#100 · [Bug] Terminating llama-swap doesn't always kill child process
Labels: posix (darwin, linux, bsd related issues), support (support requests)
Opened Apr 23, 2025 by FullstackSensei

#97 · [Feature Request] Add delay period in yaml configuration between running successive configurations
Labels: configuration (related to configuration of llama-swap)
Opened Apr 22, 2025 by FullstackSensei

#95 · Ability to point to a folder containing multiple models for quick configuration
Labels: configuration (related to configuration of llama-swap)
Opened Apr 20, 2025 by spikegee

#90 · Docker volume mount instructions don't match where models are saved when using example config
Labels: stale
Opened Apr 7, 2025 by Jordanb716

#80 · Revamp Frontend (placeholder)
Labels: enhancement (new feature or request)
Opened Mar 24, 2025 by mostlygeek

#74 · Feature Request: Ability to run llama-swap.exe as a Windows Tray Application
Labels: enhancement (new feature or request), windows (windows related issues)
Opened Mar 18, 2025 by jhemmond