time for a re-watch. it gets better with time.
and by better I mean the existential dread is worse.
Sure, but something tells me the kinds of people who use software like Wondershare and Aiseesoft video converters aren't going to be writing their own FFmpeg automations in batch files or bash scripts.
Basically everything worth using is just a wrapper for FFmpeg these days, so they all perform the same, just with different interfaces. Handbrake will always be the go-to for the basics, but if you're looking for automation and custom processing based on rules you set out, then FileFlows is worth playing with.
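For anyone who does want to go the script route, a minimal sketch of that kind of FFmpeg batch automation (the filenames, output folder, and CRF value are placeholders, not a recommendation):

```bash
#!/usr/bin/env bash
# Re-encode every MKV in the current folder to H265,
# copying audio/subtitles through and skipping files already done.
mkdir -p encoded
for f in *.mkv; do
  out="encoded/${f%.mkv}.x265.mkv"
  [ -e "$out" ] && continue
  # -nostdin stops ffmpeg from eating the loop's stdin
  ffmpeg -nostdin -i "$f" \
    -c:v libx265 -preset slow -crf 20 \
    -c:a copy -c:s copy \
    "$out"
done
```

Tools like FileFlows are essentially doing this for you, just with rules and a UI on top.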
It's been on the experimental branch for a while now.
These are the people who complain to their ISP when their game 'lags' on their wirelessly connected computer several rooms away from the router.
Can you run something like iperf3 or openspeedtest between the server and client to prove it's a network throughput issue? (quick iperf3 example below)
Do you have a network switch you can add to avoid switching through your router (if the router is indeed the problem)?
Have you ensured you aren't unknowingly using WiFi at either end?
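For reference, a basic iperf3 run between the two boxes looks like this (the IP is a placeholder for your server's LAN address):

```bash
# on the server end
iperf3 -s

# on the client end, pointed at the server
iperf3 -c 192.168.1.50
```

If the number comes back well below what the link should do, it's a network problem, not a media server problem.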
NGINX is a bit more hands-on than some other options, but it's mature, configurable, and there's a huge amount of information out there for setting it up for various use cases.
In my case, it's what I set up when I was first getting into this and it works, so I don't want to go through setting up anything else.
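For anyone starting from scratch, a minimal sketch of a typical NGINX reverse proxy block (the hostname and upstream address are placeholders; 8096 happens to be Jellyfin's default HTTP port):

```nginx
server {
    listen 80;
    server_name media.example.com;

    location / {
        # hand the request to the backend service on the LAN
        proxy_pass http://192.168.1.50:8096;

        # preserve the original host and client details
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

You'd normally put TLS in front of this as well, but that's the general shape.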
Thanks for the insightful and helpful comment.
Unraid is great and I have been using it for over a decade now, but a paid OS on a 2-bay NAS seems excessive.
I can't say I care as much as I used to, since encoding has gotten quite good, but I have also gotten better at seeing (aka more easily distracted by) compression artifacts, so while I'm less of a perfect-remux supremacist, I'm also more sensitive to bad encodes. It's a double-edged sword.
I still seek out the highest quality versions of things that I personally care about, but I don't seek those out for absolutely everything like I used to. I recently saved 12TB running a slight compression pass on my non-4K movie library, turning (for example) a 30GB 1080p Blu-ray remux into a 20GB high-bitrate H265 encode. That made more room for full-fat 4K Blu-ray files for things I care about, plus the few 1080p full remuxes I want to keep: rarities, things that aren't as good in the 4K releases, or ones where the 4K release was drastically different (like the LOTR 4Ks having poor dynamic range, or the colours being changed for the Matrix), which I may encode in the future to save more space again. I know I can compress an 80GB UHD Blu-ray file down to 60GB with zero noticeable loss, and that's as far as I need to go. I don't need to go down to 10 gigs like some release groups try to do; at that level of compression you might as well be at 1080p.
I can't go as low as a low-bitrate 720p movie these days, as I'm very close to a large screen, so they tend to look quite poor: soft edges, banded gradients, motion artifacts, poor sound, etc. But if I were on a smaller screen, or watching movies on a phone like I used to, I probably wouldn't care as much.
Another side to my choice to compress is that I have about 10 active Plex clients at the moment, and previously they were mostly getting transcoded feeds (mostly from remux sources), but now most of them are getting a better quality encode (slow CPU encode vs fast GPU stream) direct to their screens. So while I've compressed a decent chunk of the library, my clients are getting better quality feeds from it.
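For anyone wanting to run a similar pass, a minimal sketch of that kind of slow CPU encode (the CRF value is a guess at "high bitrate", not my exact settings, and the filenames are placeholders):

```bash
# slow, careful H265 encode of a 1080p remux;
# audio and subtitle streams are copied through untouched
ffmpeg -i movie.remux.mkv \
  -c:v libx265 -preset slow -crf 18 \
  -c:a copy -c:s copy \
  movie.x265.mkv
```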
I use Plexamp for that; Jellyfin does it too. You can assign libraries per user quite easily.
So for 3 users you might have 4 libraries: one per user, plus a shared library they all have access to.
I have complete ROM sets for a couple of platforms in my archive; they're available on SLSK, but there's not a huge amount of bandwidth available.
Sad to see the old giants like Vimm's finally being attacked after all these years.
the 2.5" size of disks are now mostly direct USB controller disks rather than sata adapters internally.
3.5" disks are still SATA as far as i’ve seen but the actual sku’s of the disks are often the lower grades. like you will get a disk that looks like another good disk but with only 64mb of dram instead of 256 on the one you would buy as a bare internal drive for example so they can end up a bit slower. and warranties are usually void.
Used to be my main source of disks, but these days there are better ways and it is easier to know exactly what you are getting.
Are you transcoding?
4Mbit per client for 1080p is generally a workable minimum for the average casual watcher if you have H265-compatible clients (and a decent encoder, like a modern Intel CPU for example), or 6-8Mbit per client if it's H264 only.
Remember that the bitrate-to-quality curve for live transcoding isn't as good as a slow, non-real-time encode done the brute-force way on a CPU. So if you have a few videos that look great at 4Mbit, don't assume your own transcodes will look quite that nice; you're using a GPU to get it done as quickly as possible with acceptable quality, not as slowly and carefully as possible for the best compression.
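To make that concrete, roughly the two shapes of it in ffmpeg terms (Quick Sync is just one example of a GPU path; the bitrate and CRF numbers are illustrative):

```bash
# fast, real-time style GPU transcode (Intel Quick Sync HEVC)
ffmpeg -i input.mkv -c:v hevc_qsv -b:v 4M -c:a copy gpu.mkv

# slow, brute-force CPU encode: better quality per bit, far slower
ffmpeg -i input.mkv -c:v libx265 -preset slow -crf 20 -c:a copy cpu.mkv
```

Similar target size, very different amounts of work per frame.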
You're confusing a container format (MKV) with a video codec (AV1).
MKV is just a container, like a folder or zip file, that holds the video stream (or streams; technically you can have multiple), which could be H264, H265, AV1, etc., along with audio streams, subtitles, and other files that ride along, like custom fonts, posters, and so on.
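You can see the split yourself with ffprobe, which lists every stream inside the container (the filename is a placeholder):

```bash
# print each stream's index, type and codec
ffprobe -v error \
  -show_entries stream=index,codec_type,codec_name \
  -of csv=p=0 movie.mkv
```

A typical result is one video stream (maybe AV1, maybe H265), a few audio tracks, and some subtitle streams, all in the same MKV.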
As for the codec itself, AV1 done properly is a very good codec, but to be visually lossless it isn't significantly better than a good H265 encode unless you do painfully slow CPU encodes rather than fast, efficient GPU encodes. People compressing their entire libraries to AV1 are sacrificing a small amount of quality, and some people are more sensitive to its flaws than others; in my case I try to avoid re-encoding in general. AV1 is also less supported on TVs and media players, so you run into issues with some devices not playing the files at all, or having to fall back to CPU decoding.
So I still have my media in mostly untouched original formats. Some of my old movie archives, and things that aren't critical like daily shows, are H265-encoded for a bit of space saving without risking compatibility issues. Most of my important media and movies are not re-encoded at all; if I rip a Blu-ray, I store the video stream that was on the disc untouched.
N5095? Lots of reports of that one not supporting everything it should based on other Jasper Lake chips, like the CPU getting hit for decode when it shouldn't. Also, HDR to SDR can't be accelerated with VPP on that one as far as I know, so the CPU gets smashed. I think you can do it with OpenCL though.
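If you want to test the OpenCL tonemapping path outside of the media server, the ffmpeg filter chain usually looks something like this (a sketch only; exact options depend on your build and source, and the filenames are placeholders):

```bash
# HDR10 -> SDR tonemap on the GPU via OpenCL
ffmpeg -init_hw_device opencl=ocl -filter_hw_device ocl -i hdr.mkv \
  -vf "format=p010,hwupload,tonemap_opencl=tonemap=hable:t=bt709:m=bt709:p=bt709:format=nv12,hwdownload,format=nv12" \
  -c:v libx264 -c:a copy sdr.mkv
```

If that runs without the CPU pegging, the OpenCL route is viable on the chip.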
Was it an N100? They have a severely limited power budget of 6W, compared to the N95 at 15W or so.
I'm running Jellyfin on top of Ubuntu Desktop while also playing retro games. That all sits in a Proxmox VM with other services running alongside it. It's perfectly snappy.
One of my mini PCs is just a little N95 and it can easily transcode 4K HDR to 1080p (HDR or tonemapped SDR) for a couple of clients, with excellent image quality. You could build a nice little server with a modern i3 and 16 gigs of RAM and it would smash through 4 or 5 high-bitrate 4K HDR transcodes just fine.
Is that one transcoding client local to you, or are you trying to stream over the web? If it's local, maybe put some of the budget toward a new player for that screen?
Jellyfin has come a hell of a long way since it first forked from Emby. Sure, it's not as feature-complete and polished as Plex, but it's far from shit, and it's free and open.
I run both side by side with several clients on each, and I've been a Plex Pass holder since 2013.