I know that for data storage the best bet is a NAS and RAID1 or something in that vein, but what about all the Docker containers you're running, the carefully configured services on your RPi, the *arr services installed on your PC, etc.?

Do you have a simple way to automate backups and reinstalls of these as well, or are you just resigned to eventually having to reconfigure them all when the SD card fails, the OS needs a reinstall, or the disk dies?

  • guitars are real@sh.itjust.works · 1 year ago

    The most useful philosophy I’ve come across is “make the OS instance disposable.” That means an almost backups-first approach. Everything of importance to me is thoroughly backed up, so once the main box goes kaput, I just pull the most recent copy of the dataset and provision it on a new OS, maybe on new hardware if needed. These days, it’s not that difficult. Docker makes scripting backups easy as pie. You write your docker-compose file so that all config and program state lives in a single directory (see the sketch below). Back up that directory, and all you need to get up and running again with your services is access to Docker Hub to fetch the application code.
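    To make that concrete, here’s a minimal sketch of what such a compose file might look like. The service, image, and paths are only illustrative (assuming a linuxserver.io-style *arr container); the point is that every bind mount holding config or state sits under one directory:

    ```yaml
    # docker-compose.yml — illustrative sketch: all service state lives under ./appdata
    services:
      radarr:
        image: lscr.io/linuxserver/radarr:latest
        volumes:
          - ./appdata/radarr:/config   # app config + database — the part you back up
          - ./media:/media             # bulk media, backed up separately (NAS/RAID)
        ports:
          - "7878:7878"
        restart: unless-stopped
    ```

    With that layout, a backup is roughly `docker compose stop && tar czf backup.tar.gz docker-compose.yml appdata && docker compose start` (stopping first so you don’t snapshot a SQLite database mid-write), and a restore on fresh hardware is just untarring and running `docker compose up -d`.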

    There are some downsides to this approach (Docker’s security model sorta assumes you can secure/segment your home network better than most people actually can), but honestly, for throwing up a small local service quickly it’s kind of fantastic. Also, if you ever decide to move away from Docker, the experience will give you insight into what actually amounts to program state for the applications you use, which will make doing the same thing without Docker that much easier.