A fellow thoughtbotter and I share a server on Digital Ocean, which is adorably called a “droplet” 💧. Our personal websites are hosted there, along with a couple of other sites being built by mutual friends.
Since the sites are relatively modest, we were able to keep the droplet pretty small. That worked until performance issues cropped up while we were running commands on the server, namely while trying to install Ghost to build our sites. (We will get to why this detail is important later. Don’t want to spoil the fun.)
Based on the size of our sites, it didn’t seem like we should have been dealing with performance problems at all. Thus, we decided to do some detective work and figure out what was going on.
Top it off
The first thing we did was run `top` to figure out what resources were being used and by which services. Though this might seem rote for people who spend a lot of time in their terminals, the biggest takeaway for me, while running this command, was that you can actually sort the results and choose which columns you want to display.
To do this, first enter `top`. Then, type `Shift-F`. This will bring you to a panel that looks like this:
As the screenshot shows, there are a lot of different columns to choose from, depending on the data you need to clarify the problem you are observing. We focused on sorting by `%MEM`, since we were curious about what was using the most memory.
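In case it helps, here is roughly how that looks from the keyboard (the interactive keys and the `-o` flag are from the procps-ng `top` that ships with most Linux distros; older versions may differ):

```sh
# Interactive: run top, press Shift-F, highlight a field such as %MEM,
# press 's' to make it the sort field, then press Esc to return to the list
top

# Non-interactive alternative: start top already sorted by memory usage
top -o %MEM
```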
For us, `mysql` earned the top spot in resource consumption, using over 37% of our memory.
Hungry, Hungry mysql
So `mysql` was having a party with our limited resources. What could we do? We reasoned that the first thing we could try was limiting the resources the database could use.
At the end of the day, we were more concerned about memory being used up to speed up `mysql` than about `mysql` running a smidgeon slower than it had been. Given the very small amount of traffic to our personal sites, that seemed like a safe trade-off.
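The knobs for that live in MySQL’s configuration. As a rough sketch (the path and values here are illustrative guesses for a small Ubuntu droplet, not the settings we ultimately landed on):

```sh
# Check how much memory the InnoDB buffer pool may use; it is usually
# the single biggest consumer in a default MySQL install
sudo mysql -e "SHOW VARIABLES LIKE 'innodb_buffer_pool_size';"

# Limits are set in the MySQL config and take effect after a restart,
# e.g. in /etc/mysql/mysql.conf.d/mysqld.cnf:
#
#   [mysqld]
#   innodb_buffer_pool_size = 128M
#   performance_schema      = OFF
sudo systemctl restart mysql
```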
We found a useful article on how to optimize `mysql`, along with a script that displays memory statistics for the database in one handy chart.
While the script gave us a detailed list, we noticed that the numbers in this table didn’t match what `top` said `mysql` was using: `top` showed `mysql` using significantly more memory. That discovery led us down a different path.
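If you ever want to cross-check what `top` reports, one quick sanity check (not from the article we followed, just a common habit) is to ask `ps` about the server process directly:

```sh
# Resident memory (RSS) and virtual memory (VSZ) for mysqld, in kilobytes
ps -C mysqld -o pid,rss,vsz,cmd
```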
Swap it to me
This got us both curious about the big picture. What exactly was going on at the system level? My colleague had the idea of checking how much swap space we were using.
Swap space, also known as “swap memory”, lends a hand when the physical memory (RAM) of a system is full. It shouldn’t replace the option of adding more RAM, since swap lives on the much slower hard drive, but with so little traffic to our sites that dip in speed wasn’t going to affect us.
There are a variety of commands that will show you how much swap space you have and how much you are using, but we opted to use `free`. (If you append the `-h` flag, the results are translated from bytes into more human-readable numbers, e.g., 1073741824 becomes 1.0Gi.)
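The output looks something like this (the numbers below are made up to show the shape of the output, not our actual values):

```sh
# Memory and swap usage in human-readable units
free -h
#               total        used        free      shared  buff/cache   available
# Mem:          981Mi       703Mi        89Mi        24Mi       188Mi       112Mi
# Swap:            0B          0B          0B
```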
The funny thing is that it turned out we weren’t using swap space at all; we had to set it up on our server ourselves. That ended up being very straightforward thanks to a tutorial by Digital Ocean.
As for the amount of swap space we needed to add, opinions varied quite a bit. However, given the small size of our server, we decided to go with a 1:1 ratio of RAM to swap.
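The setup itself boils down to a handful of commands. This is roughly what the Digital Ocean tutorial walks you through, assuming a droplet with 1 GB of RAM and the 1:1 ratio we picked:

```sh
# Create a 1G swap file and lock down its permissions
sudo fallocate -l 1G /swapfile
sudo chmod 600 /swapfile

# Mark the file as swap and enable it
sudo mkswap /swapfile
sudo swapon /swapfile

# Make it survive reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab

# Confirm it is active
sudo swapon --show
```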
Conclusions Drawn (and Quartered?)
To test our results, we ran updates on our server. Since nothing crashed and our resources didn’t seem strained, we considered it a win for the time being. We even got to downgrade our server back to a smaller size, saving everyone a whopping $3/month. The real test will come when we run other commands on an as-needed basis, or when `mysql` needs more juice. 💧
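For the curious, that test amounted to little more than this (assuming an Ubuntu or Debian based droplet):

```sh
# Rerun the kind of workload that previously strained the server
sudo apt update && sudo apt upgrade -y

# Then check how much memory and swap it actually touched
free -h
swapon --show
```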
Overall, though, I learned a lot about server management on a deeper level than I had previously been exposed to. Unlike working on a larger team where this work gets distributed, it’s just been me and my colleague video chatting on weekends to delve into a subject we don’t come across as often.
However, do you remember when I said we were installing Ghost for our sites, and that this is where we noticed the performance issues? Well, it turns out pretty much everyone who ran Ghost on a small server was noticing the same thing. Know what their solution was? You guessed it.
Thanks, `yarn` and `npm`, for a fun afternoon of server debugging. 💯