Server Migration


I had a network device go out on my server. There are two network cards, but this is the second one to die, and this thing is nearly a decade old. It has suffered through three power surges that blew out the surge protectors, lost half of the controllers for the hard drive interfaces, and had its internal video die and its power supply replaced, and one CPU socket still goes offline every few days. It was running Fedora just fine, but I figured it was time for Fedora to get the F out, rather than just transplanting it into newer hardware. I kept the old server so long because it ran Windows for years and needed the dual quad-core CPUs, 8 RAM slots, 6 hard drive connections, and so on. But Linux does not need that kind of horsepower, and it would hardly ever hit more than 20% usage.

Now, we are looking at these ‘downgrades’: The power supply drops from 650 watts to 215 watts. The processor is an old dual-core Intel instead of two quad-core Xeons. There are only 4 drive connections, which is enough for an optical drive and the internal drive; all other storage will be external to the machine. The case will be a fraction of the size. Instead of a monstrosity with 8 drive bays, 12 fans, and so on, the new one will live inside a slim, horizontal Dell case that fits easily on a shelf.

It has been quite the adventure setting up yet another Linux distro, although Ubuntu has been much more cooperative than bleeding-edge Fedora. I tried Debian first, but ran into a couple of snags with some software not working properly and a driver incompatibility, so Ubuntu it is. The initial setup was incredibly easy, just like always. After boot, the visual interface would occasionally freeze, but remote SSH kept running fine, so I switched from KDE to GNOME, thinking that might help. It did, but only in that the freezes came further apart, and each one was still only remedied by a “sudo reboot now”. GNOME did help in another way, though: after a freeze, the GUI would disappear entirely and fall back to a terminal, which displayed some information about nouveau not responding. That is when I realized the problem was the video card driver, so I installed nvidia-current and used that instead of nouveau. No more freezing.
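
For anyone hitting the same nouveau lockups, the whole fix was a driver swap. A minimal sketch (nvidia-current is the package named above; the Ubuntu packaging should blacklist nouveau on its own):

    sudo apt-get update
    sudo apt-get install nvidia-current   # proprietary NVIDIA driver
    sudo reboot                           # nouveau is replaced on the next boot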

The first thing I wanted to set up was Webmin, for some easy, at-a-glance info from my phone or tablet. That went rather smoothly, but it did remind me that any future setups should have internet access during installation: my apt sources (/etc/apt/sources.list) only had two entries, and I had to manually add several that were missing before I could install the Webmin dependencies.
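
If you end up with the same sparse sources.list, the missing entries look something like the below. The mirror and release codename are assumptions (substitute your own); the dependency list is the one Webmin's site gives for its .deb package:

    # /etc/apt/sources.list -- add the missing components (codename assumed)
    deb http://archive.ubuntu.com/ubuntu precise main restricted universe multiverse
    deb http://archive.ubuntu.com/ubuntu precise-updates main restricted universe multiverse

    sudo apt-get update
    sudo apt-get install perl libnet-ssleay-perl openssl libauthen-pam-perl \
        libpam-runtime libio-pty-perl apt-show-versions python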

Next came LAMP. Linux (already installed), Apache, MySQL, and PHP. All are necessary for nearly any type of server, so a few quick apt-get installs later, they are up, no issues at all. Configure Webmin to manage MySQL, set MySQL to accept remote connections, and all is well there.
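
The installs are one line, and the remote-connection part is a one-line change in MySQL's config. A minimal sketch, assuming the php5-era package names that were current at the time:

    sudo apt-get install apache2 mysql-server php5 libapache2-mod-php5 php5-mysql

    # allow remote connections: change the bind-address line in /etc/mysql/my.cnf
    sudo sed -i 's/^bind-address.*/bind-address = 0.0.0.0/' /etc/mysql/my.cnf
    sudo service mysql restart
    # remote users still need a matching GRANT ... TO 'user'@'%' inside MySQL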

An attempt was made at installing a VNC server, but it wasn't worth the effort, since I hardly ever use the GUI, and there are some known issues when combining the nvidia drivers with VNC or RDP. SSH works just fine and is much easier to use anyway.

Since MySQL was installed, Bacula was next. It is a little complicated compared to a lot of things, but with the help of Webmin, it was mostly painless.
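
For reference, the MySQL-flavored Bacula install comes down to a few packages. These are the standard Ubuntu names, though the exact packaging split has varied between releases:

    # director and storage daemon with MySQL catalog support, plus the console
    sudo apt-get install bacula-director-mysql bacula-sd-mysql bacula-console
    # the file daemon (client) on each machine to be backed up
    sudo apt-get install bacula-fd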

The DuckDNS script still needed to be installed, which would have required the GUI if you follow the instructions on their website. Instead, I just copied the script from the Fedora installation on the old drive…But wait…it's encrypted….and…

Quick sidetrack: load cryptsetup and mount the old drive in order to copy everything that needs to be pulled over. It turns out that most file explorers like Dolphin or Nautilus seem to have an issue initially mounting these when both the drive and the volume are encrypted, and a failed attempt locks the drive until the next reboot. So, command line it is again. Simple enough, and a few “cd”s, “cp”s and “chown”s later, all the old data is safe in the new system, just not where it needs to be yet.
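
For reference, the command-line route looks roughly like this. The device name, mapper name, and paths are placeholders, and since Fedora usually layers LVM on top of LUKS, the volume-group step is included:

    sudo apt-get install cryptsetup lvm2
    sudo cryptsetup luksOpen /dev/sdb2 oldcrypt   # prompts for the old passphrase
    sudo vgchange -ay                             # activate any LVM volumes inside
    sudo mkdir -p /mnt/old
    sudo mount /dev/mapper/fedora-root /mnt/old   # mapper name depends on the old install
    sudo cp -a /mnt/old/home/me/duck.sh ~/        # pull over what's needed (path assumed)
    sudo chown me:me ~/duck.sh                    # old UIDs rarely match the new system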

Back to DuckDNS: Copy the script, make a small modification to use two domains, set up the cron job, and voilà.
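
The script itself is the one-liner from the DuckDNS instructions; listing both domains comma-separated in the update URL is the entire “small modification”. The domains and token below are placeholders:

    #!/bin/bash
    # ~/duckdns/duck.sh -- updates two domains at once
    echo url="https://www.duckdns.org/update?domains=firstdomain,seconddomain&token=YOUR-TOKEN&ip=" \
        | curl -k -o ~/duckdns/duck.log -K -

    # crontab entry to run it every five minutes:
    # */5 * * * * ~/duckdns/duck.sh >/dev/null 2>&1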

I thought about using Webacula to control Bacula, but it seemed like a lot of work for something that probably wouldn't be used much. It would have meant installing it, enabling some optional Apache features, manually grabbing the Zend Framework, setting that up with the proper references, blah, blah, blah. Webmin already had a usable interface for what I needed to do.

At this point, I thought I would take a break and work on updating a few projects, just for a change of pace. So I sit down, fire up TortoiseSVN and….no repositories found. That's right, I never set up the Subversion server. So apt-get install subversion. Luckily, the old repositories were safely backed up, and after setting up Subversion, I just plopped them where they should be, and there they were, like nothing ever happened.
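
Restoring the repositories really is just a copy, since an SVN repository is self-contained on disk. A sketch, with the repository path and the choice of svnserve as assumptions:

    sudo apt-get install subversion
    sudo mkdir -p /srv/svn
    sudo cp -a /mnt/backup/svn-repos/* /srv/svn/   # backup location assumed
    # serve over svn://; -r makes repository URLs relative to /srv/svn
    svnserve -d -r /srv/svn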

Which brings me to this latest issue: What is the appropriate backup method? Bacula is good for when data is lost or something needs to be restored, but could I prevent myself from needing to go through all of this again if, say, a hard drive fails? Granted, this time I had to since I was changing from Fedora to Ubuntu, but with how smoothly this went, I don't anticipate changing again without some groundbreaking reason.
There are options like partimage that can make a complete copy of a partition, but it doesn't work with newer file systems like ext4. There are a few tools bundled with different distros that claim to back up entire drives, but most of them seem to grab only the files, not the partitions or boot info. Then there is rsnapshot, which keeps rotating incremental snapshots using rsync and hard links. And of course, Clonezilla, but that requires a live boot, with the server taken down.
It may seem like overkill, but I am leaning toward using Clonezilla about once a month (or any time there is a significant configuration change) to create a full system image, exactly copied to another drive, so that if necessary, I can just swap drives and keep going. Then rsnapshot will keep a rolling backup on a second connected drive, with that backup data mirrored to another system on the network. Space is not an issue here, considering the full system is only about 35GB right now, with everything installed and the servers running with their data, and the local network has close to 10TB of available capacity. Additionally, Bacula will run backups of the vital pieces such as the code repositories, MySQL databases, and stored user files. These will be stored on a network storage device located in a separate building, as well as mirrored in cloud storage.
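
For the rsnapshot half of that plan, the configuration is short. One gotcha worth knowing: fields in rsnapshot.conf must be separated by tabs, not spaces. Paths and rotation counts below are just illustrative assumptions:

    # /etc/rsnapshot.conf (excerpt; fields MUST be tab-separated)
    snapshot_root   /mnt/backupdrive/snapshots/
    retain  daily   7
    retain  weekly  4
    backup  /etc/       localhost/
    backup  /home/      localhost/
    backup  /srv/svn/   localhost/

    # cron entries to drive the rotation:
    # 30 3 * * *   /usr/bin/rsnapshot daily
    # 0  4 * * 1   /usr/bin/rsnapshot weekly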

I welcome your comments, additions, etc., here. It's been quite some time since I've really used Linux, so I'm sure I've left some things out.
