To be fair, resolution alone is not enough to measure quality; bitrate plays a huge role. A high-resolution video can look worse than a lower-resolution one if the lower-resolution one has a higher bitrate. Many videos online claim to be 1080p but still look like garbage because of a low bitrate (YouTube is a common example). With a high-bitrate video the difference is easy to spot: hair, fabric, skin details, grass, everything is noticeably sharper and crisper.
Edit: so yeah, I agree with you, because often they are both low-bitrate…
Exactly, this is about compression. Imagine a full HD frame, 1920x1080, with 8 bits per channel for each of the 3 RGB channels. That is 1920x1080x8x3 = 49 766 400 bits, or roughly 50 Mb (roughly 6 MB). This is uncompressed. Now imagine a video at 24 frames per second (typical for movies): that's almost 1200 Mb per second. For a 1h30 movie, that would be an immense amount of storage, just compute it :)
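If you don't want to do it by hand, here's the arithmetic above as a quick sketch (same numbers as in the comment: 1080p, 8 bits x 3 channels, 24 fps, 90 minutes):

```python
# Uncompressed storage for a 1080p movie, following the numbers above.
width, height = 1920, 1080
bits_per_pixel = 8 * 3                # 8 bits per channel, 3 RGB channels
frame_bits = width * height * bits_per_pixel   # 49,766,400 bits per frame

fps = 24
seconds = 90 * 60                     # a 1h30 movie
total_bits = frame_bits * fps * seconds

print(frame_bits / 1e6)               # ~49.8 Mb per frame
print(frame_bits * fps / 1e6)         # ~1194 Mb per second
print(total_bits / 8 / 1e9)           # ~806 GB for the whole movie, uncompressed
```

So an uncompressed 1h30 movie at that resolution would be on the order of 800 GB.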
To solve this, movies are compressed (encoded). There are two types of compression: lossless (the information is preserved exactly, with no quality loss) and lossy (quality is degraded). Lossy compression is the common choice because it yields by far the biggest storage savings. For a given compression algorithm, the less bandwidth you give it, the more video quality it has to sacrifice to meet your budget. That budget is what bitrate refers to.
Of note: different compression algorithms differ in how much quality they can pack into the same file size. AV1, for instance, allows significantly higher video quality than H.264 at the same file size (i.e. the same bitrate).