College Bot V2

A continuation of LCC-Bot for our college discord.

Note that this entire bot is satirical, especially the AI. Views expressed in documents in this repository are not necessarily my actual views; they are often intended as satire (a lot of the time I'm mocking politicians).

Installing & Running


Prerequisites are:


(While you can run this without docker, the project is heavily optimised for a docker-compose stack.)

  • A machine with docker
  • A CPU with at least 64 bits, a core, and a clock speed of at least 1Hz. ARM is not preferred, but should work.
  • At least 256 Megabytes of RAM (mostly for the host OS & docker, though this project is python so do with that what you will)
  • I'd allocate at least 5GB of your disk to this process, but it's not strictly needed. The largest consumer will be Chrome for /screenshot.


Ollama is included in the docker compose stack, which enables the /ollama command. If you do not want to use this server, you should omit it from your config.toml. Otherwise, unless you're shoving a GPU into your docker container, you should expect insane CPU usage. Recommended for Ollama:

  • Over 50 gigabytes disk space (for multiple models)
  • A CPU that has at least 4 cores and runs at at least 2GHz (for some semblance of speed)
  • 8GB or more RAM
  • (Optional) NVIDIA GTX 1060 or AMD 6600 or newer (5th generation and older cards do not support ROCm)
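
As a rough illustration of disabling /ollama, here is a hypothetical config.toml fragment. The key names below are assumptions for the sake of example; check config.example.toml for the real ones.

```toml
# Illustrative only — the real key names live in config.example.toml.
[ollama]
server = "http://ollama:11434"  # 11434 is Ollama's default port; remove this section to disable /ollama
```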


The /screenshot command uses Chromium & chromedriver (via Selenium) to take screenshots of navigated pages. It is not suitable for a low-power VPS:

  • 3GB Free RAM (maybe more for heavy pages)
  • 2+ CPU Cores
  • 10GB+ disk


The autoresponder has a few features, but one you should keep in mind is automated transcoding. If you want to enhance the transcoding speed (and lower the load on your CPU), you should pass through /dev/dri to your docker container. Alternatively, you can disable/configure it in config.toml (see below).

If you are going to use transcoding with hardware acceleration (/dev/dri), you should have at least a 6th generation Intel CPU (or any other CPU that has HEVC 8-bit/better acceleration).
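
Passing /dev/dri through might look like the following docker-compose.yml excerpt. The service name "jimmy" is an assumption here; use whatever the provided compose file calls the bot's service.

```yaml
# Hypothetical excerpt — adapt the service name to the provided docker-compose.yml.
services:
  jimmy:
    devices:
      - /dev/dri:/dev/dri   # expose the host's render nodes for VAAPI transcoding
```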


Jimmy v2 also features the yt-dl command; however, a lot of the YouTube functionality will be hindered if you host the bot on a cloud server. It is recommended you either host Jimmy on a homeserver, or proxy your cloud server back to your residential connection, so that Google does not block you.


All possible configuration options are in config.example.toml. Copy it to config.toml, edit it, and off you pop.
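
The setup step sketched in shell. This snippet is self-contained (it fabricates a stand-in example file in /tmp so it can run anywhere); in the real repo you would just run the cp from the repository root.

```shell
# Self-contained sketch of first-run configuration.
set -eu
mkdir -p /tmp/jimmy-config-demo
cd /tmp/jimmy-config-demo
# Stand-in for the repository's config.example.toml (contents are illustrative).
printf 'discord_token = "CHANGE_ME"\n' > config.example.toml
# The documented step: copy the example, then edit the copy.
cp config.example.toml config.toml
grep -q CHANGE_ME config.toml && echo "config.toml ready to edit"
```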


docker-compose.yml is provided. Use docker compose up. The latest image will be pulled automatically.

As the image is quite large, and my upload speed is... not, you may want to build the image yourself.

You can clone this repository, run docker build -t <image-name> . (using the image name from docker-compose.yml), and then docker compose up -d as normal. Alternatively, you can edit the compose file, replacing image: with build: ..
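
The image:-to-build: swap might look like this docker-compose.yml excerpt. The service and image names below are placeholders, not the repository's actual values.

```yaml
# Hypothetical excerpt — build locally instead of pulling the published image.
services:
  jimmy:
    # image: git.example.com/jimmy/college-bot-v2:latest
    build: .   # build from the Dockerfile in the repository root instead
```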