Debugging a slow server with AI
For the past week (early Dec 2025), I’ve been having a massive frustration with a client’s server; I just couldn’t figure it out.
The issue started subtly. The site felt heavy. Sluggish. We assumed it was a recent update to how we handle video archives, but the maths didn't look right. Being a systems thinker means looking for the root cause, not just treating the symptom.
I worked through it using Gemini, pasting in screenshots of htop output (which lists running processes with their CPU and RAM usage) for it to comb through. The culprit? Not our code, but a vulnerability in a self-hosted analytics tool, Umami, we were running. Well, actually I shouldn’t name and shame Umami like that: the vulnerability was in a React component package, which in turn was used by NextJS, which in turn was used by Umami.
Someone had exploited that vulnerability to install a cryptominer, quietly siphoning off processing power in the background until the server crashed.
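For the record, you don’t need a screenshot of htop to do this kind of triage; a rough equivalent from the command line (assuming a GNU/Linux server, since `--sort` is a GNU ps flag) looks like this:

```shell
# List the top CPU-consuming processes, highest first.
# A cryptominer typically sits at or near the top, as an
# unfamiliar binary name pinned at close to 100% CPU.
ps aux --sort=-%cpu | head -n 10
```

Pasting plain text output like this into an AI chat also tends to work at least as well as a screenshot.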
Frankly, AI made this fix happen.
I was close to re-provisioning the server — which isn’t hard or particularly time consuming (1-2 hours) — but I was fairly convinced the performance issues were coming from my own code, in which case a fresh server wouldn’t have fixed anything.
It’s also a reminder of a core business truth: complexity has a hidden cost.
We were self-hosting this analytics tool to avoid a monthly subscription. On paper, it saved money. In practice, it cost us time, server stability, and a fair bit of stress. We’re considering a move to Tinylytics for that project, as I’ve been using it lately (this site included) and it’s a great app. It’s a paid service, but it’s managed by someone else. It runs on their servers, not ours, so it removes the maintenance burden almost entirely.
Sometimes, the most "optimised" choice isn't the cheapest one—it’s the one that protects your time and keeps things calm. And, increasingly, AI does help.