• 0 Posts
  • 10 Comments
Joined 1 year ago
Cake day: June 11th, 2023




  • It’s not a simple task, so rather than trying to be exhaustive, I’ll give a few specifics and then some more general principles.

    First, some specifics:

    • disable remote root login via ssh.
    • disable password login, and only permit ssh keys.
    • run fail2ban to lock people out automatically.
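
    For the ssh items, here’s a rough sketch of what that looks like on a Debian/Ubuntu-style box; paths, package commands and service names vary a bit by distro:

    ```
    # /etc/ssh/sshd_config.d/10-hardening.conf (or the equivalent lines in /etc/ssh/sshd_config)
    PermitRootLogin no
    PasswordAuthentication no
    KbdInteractiveAuthentication no    # called ChallengeResponseAuthentication on older OpenSSH
    PubkeyAuthentication yes

    # then reload sshd and set up fail2ban
    sudo systemctl reload ssh          # the service is "sshd" on some distros
    sudo apt install fail2ban          # the packaged defaults normally include an sshd jail
    sudo systemctl enable --now fail2ban
    ```

    Test that your key login works in a second session before you reload with password auth disabled, or you can lock yourself out.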

    Generally:

    • only expose things you must expose. It’s better to do things right and securely than to do them the easy way. Exposing a web service only requires you to expose port 443 (https); basically everything else is optional.
    • enable every security system that you don’t have reason to disable. SELinux giving you problems? Don’t turn it off, learn how to write rules to let your application do the specific things it needs. Only make firewall exceptions where needed, rather than disabling the firewall (there’s a sketch of both after this list).
    • give system users the minimum access they require to function.
    • set folder permissions as restrictively as possible. FACLs (file access control lists) will help, because they let you be much more nuanced.
    • automatic updates. If you have to remember to do it, it won’t happen. Failure to automate updates means your software is out of date.
    • consider setting up a dedicated authentication service like Authelia or Keycloak. Applications tend to, frankly, suck at security. It’s not what they’re focused on building, so it’s rarely as good as a dedicated service. There are other follow-on benefits too, like single sign-on.
    • if it supports two factor, enable it.
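
    To make a few of those concrete (the SELinux, firewall, FACL and automatic-update points), here’s roughly what they look like on a Fedora/RHEL-style system. The package and command names differ on other distros, and “appuser” and “/srv/data” are just placeholders:

    ```
    # SELinux: see what was denied and build a narrow policy module instead of disabling it
    sudo ausearch -m AVC -ts recent | audit2allow -M my-app
    sudo semodule -i my-app.pp

    # firewall: open only the service you actually need
    sudo firewall-cmd --permanent --add-service=https
    sudo firewall-cmd --reload

    # FACLs: one user gets read access to one tree, nothing more
    sudo setfacl -R -m u:appuser:rX /srv/data

    # automatic updates
    sudo dnf install dnf-automatic
    sudo systemctl enable --now dnf-automatic.timer   # timer name varies slightly between releases
    ```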

    You mentioned using Cloudflare, which is good. You might also consider configuring your firewall to disallow connections from your server into the rest of your local network. That way, if your server gets owned, it can’t be used to poke at other things on your network.
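
    A rough sketch of that last point with iptables on the server itself, where 192.168.1.0/24 and 192.168.1.1 are placeholders for your LAN subnet and router; nftables or the router’s own firewall can do the same job:

    ```
    # allow replies to connections that LAN clients opened to the server
    sudo iptables -A OUTPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
    # allow DNS to the router if that's your resolver
    sudo iptables -A OUTPUT -d 192.168.1.1 -p udp --dport 53 -j ACCEPT
    # block the server from opening new connections into the rest of the LAN
    sudo iptables -A OUTPUT -d 192.168.1.0/24 -j DROP
    ```

    Those rules don’t persist across reboots on their own; something like netfilter-persistent or your distro’s firewall tooling has to save them.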





  • So, you’re going to run into some difficulties because a lot of what you’re dealing with is, I think, specific to casaOS, which makes it harder to know what’s actually happening.

    The way you’ve phrased the question makes it seem like you’re following a more conventional path.

    It sounds like maybe you’ve configured your public traffic to route to the Nginx Proxy Manager admin interface instead of to nginx itself.
    Instead of having your router send traffic on 80/443 to 81, have it send 80 to 80 and 443 to 443, which nginx itself should be listening on.
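
    For reference, a stock Nginx Proxy Manager container (outside of casaOS, which may wire things up differently) publishes its ports roughly like this:

    ```
    # docker-compose.yml excerpt - illustrative defaults, not your casaOS config
    ports:
      - "80:80"     # HTTP:  forward your router's port 80 here
      - "443:443"   # HTTPS: forward your router's port 443 here
      - "81:81"     # admin UI: keep this LAN-only, don't forward it from the router
    ```

    So the admin UI on 81 stays internal, and only 80/443 ever see public traffic.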

    Systems that promise to manage everything for you are great for getting started fast, but they have the unfortunate side effect that you don’t actually know what they’re doing or what you actually have running. It can make asking for help a lot harder.


  • You’ll be fine enough as long as you enable MFA on your NAS, and ideally configure it so that anything “fun”, like administrative controls or remote access, is only available on the local network.

    Synology has sensible defaults for security, for the most part. Make sure you have automated updates enabled, even for minor updates, and ensure it’s configured to block multiple failed login attempts.

    You’re probably not going to get hackerman poking at your stuff, but you will get bots trying to SSH in and log in to the WordPress admin console, even if you’re not running WordPress.

    A good rule of thumb for securing computers is to minimize access/privilege/connectivity.
    Lock everything down as far as you can, turn off everything that makes it possible to access it, and enable every tool for keeping people out or dissuading attackers.
    From there you can expose port 443 on your NAS publicly, and only that port, because you don’t need anything else.
    Configure your router to forward only port 443 to the NAS.
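
    If you want to sanity-check it afterwards, a quick port scan from outside your network (a phone hotspot works) shows whether anything besides 443 answers. 5000/5001 below are the default DSM web ports and YOUR.PUBLIC.IP is a placeholder:

    ```
    nmap -Pn -p 22,80,443,5000,5001 YOUR.PUBLIC.IP
    ```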

    It feels silly to say, but sometimes people think “my firewall is getting in the way, I’ll turn it off”, or “this one user needs read access to one file, so I’ll give every user on the system read/write/execute on this folder and every subfolder”.

    So as long as you’re basically sensible and use the tools available, you should be fine.
    You’ll still poop a little the first time you see that 800 bots tried to break in. Just remember that they’re already doing that now; there’s just nothing listening to write down that they tried.
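
    On a plain Linux box you can see those attempts in the auth log; Synology’s Log Center shows the same thing in the DSM UI:

    ```
    # Debian/Ubuntu-style log location; "journalctl -u ssh" also works on systemd distros
    sudo grep -c 'Failed password' /var/log/auth.log
    ```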

    However, the suggestion to put Cloudflare in front of GitHub Pages and use something like Hugo is a great example of “opening as few holes as possible” and “using the tools available”.
    It’s what I do for my static sites, like my recipes and stuff.
    You can configure a GitHub Action that compiles the site and deploys it whenever a commit happens, which is nice.
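
    A minimal sketch of that workflow, assuming Hugo and the peaceiris actions (treat the exact action versions as approximate; the Hugo docs keep a maintained copy):

    ```
    # .github/workflows/deploy.yml
    name: deploy site
    on:
      push:
        branches: [main]
    permissions:
      contents: write   # needed for the gh-pages push
    jobs:
      build-deploy:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - uses: peaceiris/actions-hugo@v2
            with:
              hugo-version: 'latest'
          - run: hugo --minify
          - uses: peaceiris/actions-gh-pages@v3
            with:
              github_token: ${{ secrets.GITHUB_TOKEN }}
              publish_dir: ./public
    ```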


  • VPNs and casting sometimes create a complicated network situation. Without being able to see exactly what’s going on, it’s difficult to say what’s happening.

    Sometimes casting involves sending the data from the controller, your laptop, to the renderer, your TV. That means your laptop pulls the data down and then forwards it.
    Sometimes it involves telling the renderer how to get the data so that it can pull it down and play it.

    When you use a VPN, you’re sending your traffic through a tunnel so that it’s “outside” your local network. There are usually exceptions for certain local traffic that needs to stay local, unless you’ve configured it otherwise, which you probably haven’t.
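
    As an example of “configuring it otherwise”: most consumer VPN apps have an “allow LAN access” toggle, and with a raw WireGuard config it comes down to what you route into the tunnel (the values below are placeholders):

    ```
    [Peer]
    PublicKey = <provider key>
    Endpoint = vpn.example.com:51820
    # full tunnel: everything, including attempts to reach your TV, goes through the VPN
    AllowedIPs = 0.0.0.0/0
    # to keep LAN traffic like casting local, use a list that covers the internet but
    # excludes your LAN subnet (e.g. 192.168.1.0/24), or use your client's
    # "exclude local network" option if it has one
    ```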

    It’s honestly curious that it ever works, since the VPN should make it so you’re basically “not there”, and so casting shouldn’t be possible.

    My recommendation would be to use Plex if it works.
    An alternative is to download over the VPN, then drop off it to cast.

    There are enough moving parts that you’re not going to have any fun figuring it out, and the answer will probably be something you can’t fix.