Jimmy is sentient! Ollama-integrated Discord bot
nexy7574
954d01bca5
# Sentient Jimmy

Another Ollama bot for Discord, designed for mesh (multi-server) self-hosting.
## Example config.toml

```toml
[bot]
token = "your-bot-token"
debug_guilds = [123456789]  # omit for global commands

[ollama]
order = ["server1", "server2", "fallback"]
# ^ Order of preference for Ollama servers. If server1 is offline, server2 is tried, and so on.

[ollama.server1]
base_url = "https://hosted.ollama.internal"  # default port is 443, because HTTPS
gpu = true
vram_gb = 8

[ollama.server2]
base_url = "http://192.168.1.2:11434"
gpu = true
vram_gb = 4  # anything under 8 GB enables "low VRAM mode" in Ollama

[ollama.fallback]
base_url = "http://192.168.1.250:11434"
gpu = false
vram_gb = 32  # for CPU-only Ollama, "vram" is actually just regular RAM
```