Fixing a Contabo Deepseek Server
You may have seen that Contabo are offering a one-click Deepseek server based on their VDS offering. These are semi-virtualised servers with dedicated processors and memory but shared storage, which makes them more capable than a regular shared VPS, but cheaper than a dedicated server.
Deepseek is, of course, the Chinese open source LLM provider that has effectively cut the cost of AI by 95%, and its models run on the open source ollama platform. I’ve been tasked by a client with setting up a proof of concept for, basically, document munging using *cough* AI, and this seemed like a good way to do it.
Setup is straightforward using Contabo’s charmingly 2000s-looking interface, and the server is ready pretty much immediately, with the root password you selected mailed to you (in plaintext). For the Deepseek one-click you also get a password for the Open-WebUI interface.
Logging into the server over ssh shows a banner telling you that the Open-WebUI login is in a file called webui-password.txt in the root directory. This isn’t the password they send you in the email. Ollama and Open-WebUI run in podman, and there’s a launch script in /root called setup-webui.sh. Caddy is used as an HTTPS proxy, and this is where the problem starts.
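Before changing anything, it’s worth seeing what the one-click install actually left behind. A minimal sketch of the checks I’d suggest, assuming Caddy is systemd-managed (worth verifying on your install):

ls /root
cat /root/webui-password.txt
# which containers are running, and under what names?
podman ps
# is Caddy running as a system service? (assumption: systemd-managed)
systemctl status caddy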
Fixing the problem
Attempting to connect to the server over HTTPS fails: Caddy enables HTTPS by default and obtains its own certificates from Let’s Encrypt, but the server has no hostname to request a certificate for, because there’s no way of creating one in the Contabo control panel. To fix that, give the server a hostname in DNS, allow DNS to propagate, and edit /etc/caddy/Caddyfile as follows:
your.host.name {
    reverse_proxy localhost:3000
}
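Before asking Caddy to fetch a certificate, it’s worth confirming that the DNS record has actually propagated and that the Caddyfile parses (dig comes from the dnsutils/bind-utils package, which may not be preinstalled):

# confirm the A record resolves to this server's IP
dig +short your.host.name
# syntax-check the Caddyfile before reloading
caddy validate --config /etc/caddy/Caddyfile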
Run

caddy stop
caddy start

which reloads the config and requests an SSL certificate.
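If your image runs Caddy under systemd rather than standalone (an assumption; systemctl status caddy will tell you), reloading the service does the same job:

systemctl reload caddy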
Going to https://your.host.name should now get you the Open-WebUI interface. The login requires an email address and password, except, as mentioned, what you got from Contabo was a password for root.
An email and password are in /root/webui-password.txt, except that, in my case at least, the script interpreted the hostname of the server as <%=, so I got the username admin@<%=, which unsurprisingly doesn’t work.
Add a hostname with hostnamectl set-hostname. I added a fully qualified name, but it’s likely that the interface only needs admin@hostname. Modifying the script not to use admin might be worth considering as well.
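For example (your.host.name stands in for whatever name you created in DNS):

# set the hostname, then confirm it took
hostnamectl set-hostname your.host.name
hostname -f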
The script setup-webui.sh is pretty straightforward: it creates the podman containers and sets a username and password in sqlite3 (it actually installs sqlite3, which suggests there might be a few insecure Open-WebUI installs around). After a bit of poking around, it seemed the quickest solution was to destroy the container and volume and start again:
podman stop open-webui
podman container cleanup open-webui
podman volume rm open-webui
should do it.
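If the volume removal complains that the volume is still in use, the stopped container itself is probably still referencing it; removing the container first should clear the way (a hedge, not something I needed here):

# remove the stopped container so the volume is no longer referenced
podman rm open-webui
podman volume rm open-webui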
Then run bash setup-webui.sh and you should have a fresh install with a usable username and password.
Test this by logging into the web interface at https://your.host.name.
Contabo provide the Deepseek models as advertised, but you can use the interface to install any model supported by ollama - I also installed mistral:8b.
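Models can also be pulled from the command line. A sketch, assuming the ollama container is actually named ollama (podman ps will confirm):

# pull a model into the ollama container, then list what's installed
podman exec -it ollama ollama pull mistral
podman exec -it ollama ollama list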
A few notes
- On Contabo’s Cloud VDS L offering, with six physical AMD EPYC 7283 cores, queries still thrash them, with ollama showing over 500% CPU usage when a query is running (see the snippet after this list).
- Deepseek is a reasoning model, and it’s interesting to watch its responses to queries, which to me seems preferable to ChatGPT’s invisible voice of authority.
- Searching for AI-related subjects on the web reveals a whole different universe, and it’s not one I particularly like.
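That CPU figure came from watching the containers while a query ran; podman’s built-in stats view is enough for this:

# one-shot CPU/memory snapshot of the running containers
podman stats --no-stream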
Obligatory disclaimer
Big ‘AI’ is massively abusive of physical and creative resources, and has been set up by the same tech- and former crypto-bros who have ultimately hitched their wagons to corporatism as a way of removing livelihoods from the majority of people, while backing anti-human right-wing ideologies that deny people the right to live well. However, ollama at least shows that more people can have access to LLMs, and Moore’s Law suggests that eventually models will run on your laptop, under your own control. It isn’t going to go away, so we need to find ways to make it work for us in the least exploitative way possible, and this is a start.