• 0 Posts
  • 26 Comments
Joined 1 year ago
Cake day: July 22nd, 2023


  • I work night shift and use blackout curtains and earplugs to improve my sleep during the day. Rather than cranking the volume on my alarm until it’s loud enough to consistently wake me up, I use Home Assistant to turn on some smart bulbs as my alarm. When I started (and even now, if I have to be up extra early) I also set an audible alarm to go off a few minutes after the lights come on, just in case the light doesn’t wake me. At this point my brain has gotten used to waking up to the lights, and I usually wake up and turn off the backup alarm before it goes off.

    Another useful automation for me: I have a buggy Samsung PC monitor with all sorts of annoying issues. It doesn’t consistently wake from deep sleep, which requires a hard power cycle to correct, and while it’s asleep it emits a weird high-pitched whine that beeps in time with the flashing standby light. So I use a couple of smart plugs with power monitoring, watch my PC’s power draw, and switch the wall power to the monitor on and off depending on whether the PC is on.
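    For anyone who wants to copy the idea, the monitor half can be sketched as a Home Assistant automation. The entity IDs and the 10 W threshold below are hypothetical placeholders; tune them to your own plugs and your PC’s actual off/idle draw.

```yaml
# Hypothetical sketch: cut wall power to the monitor once the
# PC's smart plug reports (near) zero draw for two minutes.
automation:
  - alias: "Monitor follows PC power"
    trigger:
      - platform: numeric_state
        entity_id: sensor.pc_plug_power   # power sensor on the PC's plug
        below: 10                         # watts; adjust to your PC
        for: "00:02:00"
    action:
      - service: switch.turn_off
        target:
          entity_id: switch.monitor_plug  # plug feeding the monitor
```

    A mirror automation using `above:` and `switch.turn_on` handles the power-on side.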


  • Not sure exactly how well this would work for your use case of forwarding all traffic, but I use autossh and ssh reverse tunneling to forward a few local ports/services from my local machine to my VPS, where I can then proxy those ports in nginx or apache. It might take a bit of extra configuration to go this route, but it’s been reliable for me for years. WireGuard is probably the “newer, right way” to do what I’m doing, but personally I find ssh tunnels a bit simpler to wrap my head around and manage.

    Technically WireGuard would have a touch less latency, but most of the latency comes from the round trip between you and your VPS, and the difference between the protocols is comparatively negligible.
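    For reference, here’s a minimal sketch of the autossh side. The host and port numbers are hypothetical placeholders; `-M 0` plus ssh keepalives is one common way to let ssh itself detect a dead tunnel. The script just assembles and prints the command so you can inspect it before running it (or dropping it into a systemd unit):

```shell
#!/bin/sh
# Sketch: the autossh invocation that keeps a reverse tunnel alive
# from the home machine to the VPS. Host and ports are hypothetical.
VPS_HOST="user@vps.example.com"
LOCAL_PORT=8080     # service running on the home machine
REMOTE_PORT=18080   # loopback port on the VPS; proxy it with nginx there

# -N: no remote shell, -f: background after connecting, -M 0: skip
# autossh's extra monitor port and rely on ssh keepalives instead.
TUNNEL_CMD="autossh -M 0 -f -N \
  -o ServerAliveInterval=30 -o ServerAliveCountMax=3 \
  -R 127.0.0.1:${REMOTE_PORT}:127.0.0.1:${LOCAL_PORT} \
  ${VPS_HOST}"

echo "$TUNNEL_CMD"
```

    On the VPS side, an nginx `proxy_pass http://127.0.0.1:18080;` block (or the apache equivalent) then publishes the service.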



  • I think my skepticism, and my desire to have docker get out of my way, has more to do with already knowing the underlying mechanics: I was used to managing services before docker was a thing, and then docker came along and said “just learn docker instead.” Which would be fine if it didn’t mean not only a shift away from what I already know, but a separation from it, with extra networking and docker configuration to fuss with. If I weren’t already used to managing servers pre-docker, then yeah, I totally get it.


  • That’s a big reason I actively avoid docker on my servers: I don’t like running a dozen instances of my database software. And considering how much work it would take to go through and configure each container to use an external database, to me it’s just as easy to learn to configure each piece of software yourself and know what’s going on under the hood, rather than relying on a bunch of defaults chosen by whoever made the docker image.

    I hope a good amount of my issues have been solved since I last seriously tried docker (back when they were literally giving away free tee shirts to get people to try it), but the times I’ve peeked at it since, it seems to me that docker gets in the way more often than it solves problems.

    I don’t mean to yuck other people’s yum though, so if you like docker, and it works for you, don’t let me stop you from enjoying it. I just can’t justify the overhead for myself (both at the system resource level, and personal time level of inserting an additional layer of configuration between me and my software).






  • I’ve dabbled with some monitoring tools in the past, but never really stuck with anything proper for very long; I usually notice issues myself. I self-host my own custom new-tab page that I use across all my devices, and between that, the Nextcloud clients, and my Home Assistant reverse proxy on the same VPS, when I do have unexpected downtime I usually notice within a few minutes.

    Other than that I run fail2ban, and have my vps configured to send me a text message/notification whenever someone successfully logs in to a shell via ssh, just in case.

    Based on the logs over the years, most bots try usernames like admin or root. I have root login disabled for ssh, and the one account that can be used over ssh has a non-obvious username that an attacker would have to guess before they could even start trying passwords. On top of that, fail2ban does a good job of blocking IPs that fail after a few tries.

    If I used containers, I would probably want a way to monitor them, but I personally dislike containers (for myself, I’m not here to “yuck” anyone’s “yum”) and deliberately avoid them.
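    The text-on-login setup can be done with a small `pam_exec` hook; this is a hypothetical sketch (the script path and how you deliver the message are up to you), with the message-building split into a function:

```shell
#!/bin/sh
# Hypothetical /usr/local/bin/ssh-login-notify.sh, wired up in
# /etc/pam.d/sshd with:
#   session optional pam_exec.so /usr/local/bin/ssh-login-notify.sh
# pam_exec exports PAM_TYPE, PAM_USER and PAM_RHOST to the script.

build_msg() {
    # $1 = PAM_TYPE, $2 = PAM_USER, $3 = PAM_RHOST
    # Only fire when a session opens, not when it closes.
    if [ "$1" = "open_session" ]; then
        printf 'SSH login: %s from %s\n' "$2" "$3"
    fi
}

# Compose the message and hand it to whatever notifier you use
# (sendmail, an SMS gateway, a push service, ...).
build_msg "${PAM_TYPE:-}" "${PAM_USER:-}" "${PAM_RHOST:-}"
```

    The account lockdown itself is just `PermitRootLogin no` plus an `AllowUsers` line in `sshd_config`.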



  • Not saying there aren’t benefits to docker; easy migration to a different host distro and avoiding dependency conflicts are the big two. But for me they’re about the only two: for what I do, it’s just as easy to write a shell script that downloads and unpacks software and copies my own config files into place as it is to do basically the same thing with docker. I could use ansible or something similar for that, but for me shell scripts are easier to manage.

    Don’t get me wrong, docker has its place. I just find that it gets in my way with its own quirks almost as much as it helps in other areas, especially for web apps like Nextcloud that are already just a single folder under the web root plus a database.

    One additional benefit of not using docker is that I can do more with a lower-powered server, since I’m not running multiple instances of PHP and nginx across multiple containers.
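    For anyone curious, a deploy script like that is entirely hypothetical boilerplate of this shape: unpack a release tarball under the web root, then copy your own config back over the defaults. The demo below fabricates a dummy tarball in temp directories so it can run end to end anywhere; real use would curl an actual release URL.

```shell
#!/bin/sh
set -eu
# Hypothetical sketch of the deploy-by-script workflow: unpack a
# release archive under the web root, then restore my own config.

WEBROOT="$(mktemp -d)"    # stand-in for /var/www
CONF_SRC="$(mktemp -d)"   # stand-in for my config checkout
echo "db_host = localhost" > "${CONF_SRC}/config.php"

# Fabricate a release tarball (normally: curl -fL -o myapp.tar.gz https://...)
SRC="$(mktemp -d)"
mkdir -p "${SRC}/myapp"
echo "app code" > "${SRC}/myapp/index.php"
TARBALL="${SRC}/myapp.tar.gz"
tar -czf "${TARBALL}" -C "${SRC}" myapp

# The actual "deploy": unpack, then copy personal config into place.
tar -xzf "${TARBALL}" -C "${WEBROOT}"
cp "${CONF_SRC}/config.php" "${WEBROOT}/myapp/config.php"

ls "${WEBROOT}/myapp"
```

    Version pinning and a service-reload step slot in naturally; the point is that the whole “image” is just a short, readable script.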


  • More recently, they’ve also said they’re happy to work with any manufacturer that wants SteamOS for their devices, and encouraged them to reach out so Valve can work with them to get things running.

    My assumption is that the real reason we aren’t seeing more movement is that a lot of the other handheld makers use full Windows compatibility as a marketing point, and I also doubt Valve is going to help get other game stores working on SteamOS. It’s not that weird for Valve to make a handheld that only officially supports Steam, but it might seem a little weird if Asus or Lenovo released a handheld that only officially supported buying games on Steam.

    I suppose a device that could dual-boot Windows and SteamOS could be a solution, but knowing how Windows updates can sometimes randomly bork dual-boot setups, I could see that being a potential problem as well.

    Ideally, Epic, Humble Games, GOG, and the remaining publishers that have made their own stores and launchers would get their games and launchers working under Linux/Proton. But I think the current plan is for hardware companies to make a bunch of mediocre Windows handhelds and hope Microsoft gets off their ass and makes a version of Windows tailored to the form factor. There really isn’t much motivation for Microsoft to do that, though, because they’d probably rather launch their own Xbox handheld and keep it for their own gaming walled garden. Game Pass already gets them onto pretty much every other platform, so why should they care?


  • I’ve been self-hosting since before docker and containers were a thing, and even though Nextcloud pushes their container images these days, I still refuse to use them; I use the community archive releases or the web installer when reconfiguring my system or migrating to a new one. Maybe it’s just Nextcloud and the other software I use, or maybe it’s that I’m not trying to build scalable server infrastructure with a lot of users, but I generally find that docker causes more problems than it solves, and it does my head in when I see projects recommend containers as the primary install method.

    Totally agree with your assessment of the plugin/app system. It feels like you need to stick to official “apps” and hope they don’t get abandoned to have anything close to a good experience, because even minor updates can break all the third-party apps on a version compatibility check alone, and then you end up waiting for each app developer to release an “update” that only bumps the compatibility number.
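    For the unfamiliar: that compatibility gate lives in each app’s `appinfo/info.xml`, so the no-op “update” is often nothing more than a one-line bump like this (hypothetical version numbers shown):

```xml
<dependencies>
  <nextcloud min-version="28" max-version="29"/>
</dependencies>
```

    If a new Nextcloud release ships and `max-version` hasn’t been raised, the app is flagged incompatible even when the code would run fine.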





  • So, my solution to this is probably a bit more jank than you’d like, but I use one of those portable USB-C monitors that also has an HDMI input (mine is mini HDMI, but I just leave the adapter plugged into it). I have HDMI cables running from the back of my systems that I can plug into the portable monitor when I need local access.

    When not in use, I can easily just store the portable monitor nearby, and it doesn’t take up much space.

    I’m running full desktops on my machines, though, and I also have those dummy HDMI plugs that emulate a monitor so the systems load the desktop properly, which lets me use VNC or other remote desktop software to access them when I’m not home. I plug the dummy plugs into the same HDMI cables I use for the display, using couplers, so I don’t run into any “dual monitor” weirdness from using both a real display and a dummy plug at the same time.

    As far as input, I have a couple of super cheap wireless keyboards with trackpads built in. They don’t provide the best typing experience, but they do the job for what I need.