  • For the free tier, Google Cloud is more transparent about what you get than AWS, IMO.

    The only catch is that you need to set your persistent disk type to “standard” for it to be totally free, since it defaults to SSD.

    Even if you do mess up the disk type, you’ll still only be paying $1–2/mo. I’ve been using GC for years, and they recently started offering dual stack, so you can do your own 6to4 tunneling or translation if you want, depending on your use case.

    AirVPN is also legit and will let you forward ports to expose your local services if you’re worried about DMCA-type issues.

    I finally got IPv6 here through Starlink; it’s nice to have full access to the internet again after a decade behind CGNAT.




  • We’re talking about replacing lost content here, though. In that sense you can use the streaming services as a “backup”: if you lose your whole collection, you just re-rip it.

    I’m actually doing this now as part of a library cleanup. Zotify + beets are a great combo for pulling down vast quantities of music and properly sorting and tagging it.

    Then I stream it to my phone in my truck using Ampache and Ultrasonic; the latter has a local buffering option.

    However, if you have some exotics, rips of rare discs, demos or prereleases, live recordings with sentimental value, etc., I would suggest keeping those properly backed up. I don’t have many of these, but the ones I do have are backed up both to the cloud and offsite.



  • You can download from Spotify using Zotify: albums, playlists, and whole artists, though if you set it to Artist you’ll unfortunately get a bunch of singles and EPs that you have to clean up.

    If you have Premium you can download at high bitrates; otherwise you get Ogg Vorbis at around 150 kbps ABR. You can have it automatically transcode to whatever format you want (there’s a generic sketch of that step after this comment), and I then feed it to beets to catalogue it and deliver it with Ampache.

    I like the moderate-bitrate Oggs myself, as I often stream from Ampache to my phone and our mobile service is quite slow, so this system works great for me.
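
    Zotify can handle the transcode itself, but if you want to see the generic approach, here’s a minimal sketch using ffmpeg from Python. The paths and quality setting are illustrative assumptions, not Zotify’s own mechanism:

        import subprocess
        from pathlib import Path

        SRC = Path("~/Music/zotify").expanduser()      # hypothetical download directory
        DST = Path("~/Music/transcoded").expanduser()  # hypothetical output directory

        for flac in SRC.rglob("*.flac"):
            out = DST / flac.relative_to(SRC).with_suffix(".ogg")
            out.parent.mkdir(parents=True, exist_ok=True)
            # libvorbis -q:a 5 lands around ~160 kbps ABR; tune to taste
            # (-n means never overwrite an existing output file)
            subprocess.run(
                ["ffmpeg", "-n", "-i", str(flac), "-c:a", "libvorbis", "-q:a", "5", str(out)],
                check=True,
            )

    After that, “beet import” on the output directory handles the cataloguing.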


  • As the other commenter said, it’s all about depth of discharge. A 10kWh LiFePO4 bank gets you almost 10kWh every time, while you should treat a 10kWh lead-acid bank as if it were a 2kWh bank for any sort of decent life, with deep discharges reserved for emergency situations. (There’s quick arithmetic on this at the end of this comment.)

    All lithium chemistries are practically maintenance-free, whereas you’re probably familiar with the water-level monitoring and equalization that lead-acid needs.

    Note that all site-built lithium banks MUST have a balancing mechanism, as this is their “automated maintenance”. Without balancing on every charge, lithium cells will be rapidly destroyed.
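
    To put numbers on that, a quick sketch; the depth-of-discharge figures are ballpark assumptions (~90% for LiFePO4, ~20% for lead-acid if you want decent cycle life):

        def usable_kwh(nominal_kwh, dod):
            # usable energy per cycle = nominal capacity x safe depth of discharge
            return nominal_kwh * dod

        print(usable_kwh(10, 0.90))  # LiFePO4:   ~9 kWh per cycle
        print(usable_kwh(10, 0.20))  # lead-acid: ~2 kWh per cycle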


  • “Deep cycle” batteries are the best of the lead-acids for the task, but they’re still obsolete, and you should source lithium if at all practical.

    However, if power interruptions are short, loads are low, or you have an external power source like solar or wind, inferior batteries can do the job.

    I use a bunch of old car batteries for the battery bank at my house. It’s more of a big capacitor than a proper bank, but it’s almost always sunny here and kilowatts of solar are pouring in.

    My critical equipment, i.e. Starlink, home and farm automation and monitoring, a cell booster, and HMI/SCADA, only takes a couple hundred watts, so no big deal. Most of the solar power goes to keeping the freezers cold.



  • Honestly, I do all my IoT stuff in plain code; it’s actually simpler, IMO, than trying to use a graphical function-block interface like Node-RED. And it’s a good way to get into coding in a way that lets you work with real systems fairly safely.

    Check out Python’s MQTT library (paho-mqtt); you can build an event-driven MQTT handler pretty easily. You set a list of topics you want to subscribe to, and when a message arrives the library calls your message-handling function. You can check the topic/payload and act on it as you want: publish other messages or perform other operations. (There’s a minimal sketch at the end of this comment.)

    I like distributed control systems myself, where individual nodes subscribe to each other and communicate directly (through the MQTT broker) when possible, plus a couple of Python scripts running on the broker system to coordinate operations that can’t easily be managed that way.

    For example, a “sundown” topic can be published by a light sensor in the evening; then either individual lights subscribe to it and respond, or a script subscribes and iterates through a list of all the lights that are supposed to come on, sending each a power-on command. The first option works with custom-built endpoints; the second works for integrating Tasmota or similar, where several different node devices may exist with different command schemas. (The second sketch below shows that fan-out.)
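
    Here’s the kind of event-driven handler I mean; a minimal sketch using paho-mqtt 2.x, with the broker hostname and topics made up for illustration:

        import paho.mqtt.client as mqtt

        # Topics to watch, as (topic, qos) pairs; all hypothetical
        TOPICS = [("env/sundown", 0), ("home/+/status", 0)]

        def on_connect(client, userdata, flags, reason_code, properties):
            client.subscribe(TOPICS)  # (re)subscribe on every connect

        def on_message(client, userdata, msg):
            # Called for every message on a subscribed topic; check the
            # topic/payload here, then publish, log, or trigger whatever you want
            print(msg.topic, msg.payload.decode(errors="replace"))

        client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt 2.x API
        client.on_connect = on_connect
        client.on_message = on_message
        client.connect("broker.local", 1883)  # hypothetical broker host
        client.loop_forever()  # blocks, dispatching callbacks as messages arrive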
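
    And the fan-out version of the sundown example: one coordinating script that turns a single published event into power-on commands for a whole list of Tasmota-style lights (device names again hypothetical):

        import paho.mqtt.client as mqtt

        # Hypothetical Tasmota-style command topics for the evening lights
        EVENING_LIGHTS = ["cmnd/porch/POWER", "cmnd/barn/POWER", "cmnd/driveway/POWER"]

        def on_connect(client, userdata, flags, reason_code, properties):
            client.subscribe("env/sundown")

        def on_message(client, userdata, msg):
            # One sundown event turns on every light in the list
            for topic in EVENING_LIGHTS:
                client.publish(topic, "ON")

        client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt 2.x API
        client.on_connect = on_connect
        client.on_message = on_message
        client.connect("broker.local", 1883)  # hypothetical broker host
        client.loop_forever()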