Just this guy, you know?

  • 0 Posts
  • 48 Comments
Cake day: June 11th, 2023

  • That’s a goal, but it’s hardly the only goal.

    My goal is to get a synthesis of search results across multiple engines while eliminating tracking URLs and other garbage. In short it’s a better UX for me first and foremost, and self-hosting allows me to customize that experience and also own uptime/availability. Privacy (through elimination of cookies and browser fingerprinting) is just a convenient side effect.

    That said, on the topic of privacy, it’s absolutely false to say that by self-hosting you get the same effect as using the engines directly. Intermediating my access to those search engines means things like cookies and fingerprinting cannot be used to link my search history to my browsing activity.

    Furthermore, in my case I host SearX on a VPS that’s independent of my broadband connection, which means even my IP address can’t be used to correlate my activity.
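
    For the curious, a minimal sketch of that kind of setup — running SearXNG (the actively maintained fork of SearX) on a VPS via Docker Compose. The image name is the official one; the port mapping and volume path are just illustrative:

    ```yaml
    # Minimal sketch — adjust the port and config path to taste.
    services:
      searxng:
        image: searxng/searxng
        ports:
          - "8080:8080"          # SearXNG listens on 8080 inside the container
        volumes:
          - ./searxng:/etc/searxng   # engine list and UI settings live here
        restart: unless-stopped
    ```

    You’d normally put a TLS-terminating reverse proxy in front of this on the VPS, but that’s beside the point here.
    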



  • Finally just started playing through Manifold Garden, and after switching to kb&m emulation with the stick and touchpad, plus some tweaks, it plays extremely well.

    Meanwhile I’ve also put a distressing amount of time into Rimworld, which works great with the Deck’s suspend mode since it’s very amenable to dropping in, doing stuff for a few minutes, then dropping out. Well except that a few minutes always turns into a few hours…



  • Your first two paragraphs make the picture worse, not better.

    As for your last, I’m not writing an economics thesis. It was a quick analysis to illustrate a problem no sane person disputes: streaming services have substantially driven down revenue for artists, to the point that for many it’s genuinely impossible to create their art while making a living wage.

    Is it better than piracy? Sure. At least the artists are getting something (well, unless you drop below Spotify’s streaming cutoff, in which case you can get fucked). But it’s still a shitty deal and gives consumers someone else to blame as artists slowly bleed out.

  • Honestly the issue here may be a lack of familiarity with how bare repos work? If that’s right, it could be worth experimenting with them if only to learn something new and fun, even if you never plan to use them. If anything it’s a good way to learn about git internals!

    Anyway, apologies for the pissy coda; I’ve deleted it, as it was unnecessary. Keep on having fun!
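
    If you do want to poke at one, a bare repo can live anywhere you can write files — the /tmp paths below are purely illustrative:

    ```shell
    # A bare repo is just the contents of a .git directory with no
    # working tree attached.
    git init --bare /tmp/demo.git

    # Clone it like any other remote (file://, ssh://, etc. all work).
    git clone /tmp/demo.git /tmp/demo-work

    # The bare repo's top level is exactly the internals (HEAD, objects/,
    # refs/) you'd normally find hidden under .git/ in a working clone.
    ls /tmp/demo.git
    ```
    
    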


  • No. It’s strictly more complexity.

    Right now I have a NAS. I have to upgrade and maintain my NAS. That’s table stakes already. But that alone is sufficient to use bare git repos.

    If I add Gitea or whatever, I have to maintain my NAS, and a container running some additional software, and some sort of web proxy to access it. And in a disaster recovery scenario I’m no longer just restoring some files on disk — I have to rebuild an entire service, restore its config and whatever backing store it uses, etc.

    Even if you don’t already have a NAS, setting up a server with some storage running SSH is already necessary before you layer in an additional service like Gitea, whereas it’s all you need to store and interact with bare git repos. Put another way, Gitea (for example) requires me to deploy everything I need to host bare repos plus a bunch of additional complexity. It’s a strict (and non-trivial) superset.
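
    To make that concrete, this is the entire “deployment” for the bare-repo approach. Local paths stand in for the NAS here; over SSH the remote URL would just become something like ssh://nas/volume1/git/project.git (hostname and path hypothetical):

    ```shell
    # One-time setup on the "NAS" (a local path stands in for it here).
    git init --bare /tmp/nas-git/project.git

    # An ordinary working repo on your machine.
    git init /tmp/project
    cd /tmp/project
    git -c user.name=me -c user.email=me@example.com \
        commit --allow-empty -m "initial commit"

    # Point it at the "NAS" and push. Plain SSH plus a filesystem is the
    # whole service — nothing else to deploy, proxy, or restore.
    git remote add nas /tmp/nas-git/project.git
    git push nas HEAD
    ```
    
    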