In part 1 and part 2 of my article series, we looked at running Ollama, both on the CLI and via the HTTP API.
In this short article I'll cover running Ollama as a Telegram bot, so that you can have 1:1 chats with it or add it to your group chats to ask it questions or have it describe images!
We’ll be using a Telegram bot running in Docker, so we’ll start by creating the directory structure for docker compose:
mkdir -p /etc/docker-compose/ollama-telegram
chmod 0700 /etc/docker-compose/
chmod 0700 /etc/docker-compose/ollama-telegram
touch /etc/docker-compose/ollama-telegram/docker-compose.yml
touch /etc/docker-compose/ollama-telegram/environment
chmod 0600 /etc/docker-compose/ollama-telegram/docker-compose.yml
chmod 0600 /etc/docker-compose/ollama-telegram/environment
Now in /etc/docker-compose/ollama-telegram/docker-compose.yml:
services:
  ollama-telegram:
    image: ruecat/ollama-telegram
    container_name: ollama-telegram
    network_mode: bridge
    restart: unless-stopped
    env_file:
      - ./environment
    pull_policy: always
    extra_hosts:
      - "host.docker.internal:host-gateway"
I should explain the docker compose file a little: we use network mode bridge so that the container can still connect to the host, since that's where Ollama is running, and we map host.docker.internal to the host gateway so that the container can resolve this name to the host machine. This way we can reach Ollama on the host without setting the network mode to host or hardcoding IPs.
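If you want to see the host-gateway mapping in action before wiring up the bot, you can run a throwaway container with the equivalent of the extra_hosts setting and query the Ollama API on the host. This is just a sketch and assumes Ollama is listening on its default port 11434:

```shell
# --add-host is the CLI equivalent of the extra_hosts entry in the compose file;
# curlimages/curl is simply a small image that ships curl.
docker run --rm --add-host host.docker.internal:host-gateway \
  curlimages/curl -s http://host.docker.internal:11434/api/tags
```

If the mapping works, this prints the JSON list of models your Ollama instance has pulled.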
Before we can continue setting up the environment file, fire up Telegram and talk to @BotFather to register a new bot and get a token for it, then talk to @getidsbot to get your Telegram user ID.
Now in /etc/docker-compose/ollama-telegram/environment:
TOKEN=<Token you got from @BotFather>
ADMIN_IDS=<Your user ID you got from @getidsbot>
USER_IDS=<Your user ID you got from @getidsbot>
INITMODEL=llama3.2:latest
OLLAMA_BASE_URL=host.docker.internal
OLLAMA_PORT=11434
# Log level
# https://docs.python.org/3/library/logging.html#logging-levels
LOG_LEVEL=DEBUG
ALLOW_ALL_USERS_IN_GROUPS=1
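Before starting the container, you can sanity-check the token against Telegram's Bot API: a valid token makes the getMe method return "ok": true along with your bot's username.

```shell
# Replace <TOKEN> with the token you got from @BotFather
# (the "bot" prefix before the token is part of the URL format).
curl -s "https://api.telegram.org/bot<TOKEN>/getMe"
```

If you get {"ok":false,...} with a 401 error instead, the token was pasted incorrectly.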
With all that done, we can simply start our bot:
cd /etc/docker-compose/ollama-telegram/
docker compose up -d
Cool! Now go chat up the bot you just created and it should respond with llama3.2!
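If the bot stays silent, two quick checks usually narrow the problem down: the container logs, and whether Ollama is actually reachable on the host (assuming the default port 11434):

```shell
# Follow the bot's logs; with LOG_LEVEL=DEBUG these are quite verbose.
docker compose logs -f ollama-telegram

# From the host, confirm Ollama is up and see which models it has pulled.
curl -s http://localhost:11434/api/tags
```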
You can now add it to groups, too, and mention it with @ so that it will respond. You can even send images while mentioning the bot and ask it to describe them, but of course you'll need a vision model for this, like llava or llama3.2-vision.
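For image descriptions to work, the vision model has to be present on the host first. With the standard Ollama CLI, pulling one looks like this:

```shell
# Pull a vision-capable model on the host so the bot can describe images.
ollama pull llava

# Optionally verify it now shows up alongside your other models.
ollama list
```

Once pulled, you can switch the bot to it from the chat.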
Even though this is all very easy, I spent quite some time fine-tuning it all to make it as platform-agnostic as possible so you can get started easily! If you like what I do, feel free to donate a cup of coffee. Meow. :D