Yeah honestly either solution is a solid one
Formerly /u/neoKushan on reddit
The guy above you gives great advice. Set up SWAG; then the only port you’re exposing is 443.
Once you have that set up, look at adding something like Authelia. This gives you 2FA on top of those apps, meaning even if someone guesses the password and the URL to access them, they still won’t be able to get in.
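If it helps, a starting point for SWAG might look something like this - a sketch based on the linuxserver.io image, where the domain, validation method and host paths are placeholders you’d swap for your own:

```shell
# Sketch of a SWAG container exposing only port 443.
# URL, VALIDATION and the host path are placeholder values.
docker run -d \
  --name=swag \
  --cap-add=NET_ADMIN \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -e URL=yourdomain.example \
  -e VALIDATION=http \
  -p 443:443 \
  -v /path/to/swag/config:/config \
  --restart unless-stopped \
  lscr.io/linuxserver/swag:latest
```

Once it’s up, the reverse-proxy configs for the individual apps live under the mounted `/config` directory.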
I appreciate what this project is doing. I’ve already got my setup configured using the trash guides, with recyclarr pulling in the latest config data for it. Is there a benefit to switching to Dictionarry, anyone know?
tachiyomi
Free and open source manga reader for Android.
(For those wondering what this discussion is about)
That "traffic between two IP addresses" is enough reason to use a VPN you trust.
Put it this way: BitTorrent traffic can be encrypted and routed over standard ports to make it look like regular web traffic, so it’s still “just traffic between two IP addresses” - but you wouldn’t run that without a VPN, would you?
The rights to search sure are, but it’s more like Google happens to be the one paying for it right now. It could be Microsoft or Yahoo or anyone.
Mozilla definitely needs to diversify better here, but the implication that they’re “funded by Google” is completely misleading.
I don’t know much about the primary developers of Lemmy,
With respect, maybe you shouldn’t be commenting on what’s going on behind the scenes. They are good developers but they’re not good leaders or shepherds of such a big project. They need to hand over stewardship to someone that can be trusted.
Google pays them to be the default search engine, they’re not funded by Google.
We desperately need a company like Mozilla to take the reins of something like Lemmy. The original developers are far too biased and short-sighted to see the bigger picture; it needs to be an independent group that promotes more open-source development.
I think they misheard them and they said “amazing piracy threat”.
I think those seedboxes you mentioned are the main reason OP isn’t using all their bandwidth. In the same way you suggest limiting total connections, those downloading will also have a limited number of connections, so of course they’ll prioritise peers on a gigabit+ uplink over those on slower links.
It all adds up and it all helps, of course.
If you read up through the thread, the person I responded to specifically said that Transmission was the easiest to run via Docker.
SABnzbd is probably the best choice of download client, FYI.
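For anyone who wants to try it, the linuxserver.io image runs the same way as the other clients in this thread - this is a sketch, with the host paths as placeholders:

```shell
# Sketch of SABnzbd via the linuxserver.io image; host paths are placeholders.
docker run -d \
  --name=sabnzbd \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -p 8080:8080 \
  -v /path/to/sabnzbd/config:/config \
  -v /path/to/downloads:/downloads \
  -v /path/to/incomplete-downloads:/incomplete-downloads \
  --restart unless-stopped \
  lscr.io/linuxserver/sabnzbd:latest
```

The web UI then lives on port 8080, same as the qBittorrent example below.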
If you’re trying to build it all from scratch, sure, but you specifically mentioned Docker and there are plenty of high-quality Docker images you can use - and it’s no harder to use a qBittorrent Docker image than a Transmission one.
Here’s the docker command for transmission:
docker run -d \
--name=transmission \
-e PUID=1000 \
-e PGID=1000 \
-e TZ=Etc/UTC \
-e TRANSMISSION_WEB_HOME= `#optional` \
-e USER= `#optional` \
-e PASS= `#optional` \
-e WHITELIST= `#optional` \
-e PEERPORT= `#optional` \
-e HOST_WHITELIST= `#optional` \
-p 9091:9091 \
-p 51413:51413 \
-p 51413:51413/udp \
-v /path/to/data:/config \
-v /path/to/downloads:/downloads \
-v /path/to/watch/folder:/watch \
--restart unless-stopped \
lscr.io/linuxserver/transmission:latest
and the equivalent for qBittorrent:
docker run -d \
--name=qbittorrent \
-e PUID=1000 \
-e PGID=1000 \
-e TZ=Etc/UTC \
-e WEBUI_PORT=8080 \
-p 8080:8080 \
-p 6881:6881 \
-p 6881:6881/udp \
-v /path/to/appdata/config:/config \
-v /path/to/downloads:/downloads \
--restart unless-stopped \
lscr.io/linuxserver/qbittorrent:latest
I’m not even going to argue that the qBittorrent Docker image is technically easier because it has less to configure; it’s all one command at the end of the day.
qBittorrent, rTorrent and Deluge can all be run via Docker with a web interface.
Thanks for the context of this! I agree, they had no good options there.
Wait, what’s this about the company?
I understand your reasoning for not setting up the other *arr apps, given you don’t have a dedicated server to run them, but you’d still benefit from running them on your PC. They handle the downloading, extraction, categorising and renaming of the media you want, and they can do all of that automatically.
Even on your computer, that’ll save you time and effort: you can just tell it what shows you want - even shows that aren’t out yet - and it’ll grab them for you whenever they appear. It’s great for when you enjoy a show and the next season starts; it just grabs it for you and the show appears one day.
A lot of people start this way, and it’s only then that they think about getting a dedicated device for it - such a device can be a decent little Synology or QNAP NAS: something small, quiet and power-efficient - but I’d definitely say you don’t need to start there. It’s worth the effort to try, though, believe me.
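To give an idea of how little setup the *arr apps need, here’s a sketch of running Sonarr via the linuxserver.io image - the host paths are placeholders, and Radarr follows the same pattern on port 7878:

```shell
# Sketch of Sonarr via the linuxserver.io image; host paths are placeholders.
docker run -d \
  --name=sonarr \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -p 8989:8989 \
  -v /path/to/sonarr/config:/config \
  -v /path/to/tv:/tv \
  -v /path/to/downloads:/downloads \
  --restart unless-stopped \
  lscr.io/linuxserver/sonarr:latest
```

Point the `/downloads` mount at the same host folder your download client uses, and Sonarr can pick up completed files from there.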
Another recommendation for Tdarr: set it up in January and let it transcode away, converting all my media to H.265 - it’s saved me over 40TB of space so far and I haven’t noticed a massive drop in quality or had any playback issues.
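For anyone curious what a transcode like that boils down to, it’s roughly this ffmpeg invocation - a sketch, where the filenames and CRF 28 are illustrative values (Tdarr’s plugins decide the actual settings):

```shell
# Re-encode the video stream to H.265 (libx265), copy audio untouched.
# input.mkv, output.mkv and -crf 28 are illustrative placeholder values.
ffmpeg -i input.mkv -c:v libx265 -crf 28 -c:a copy output.mkv
```

Higher CRF values mean smaller files at lower quality; the point of a tool like Tdarr is to run this kind of conversion across a whole library automatically.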