What are you self-hosting?

+1 … what already made this thread very interesting to me is code-server for VS Code remote dev (thanks @ILTPWC)

2 Likes

doing a lot of this myself :+1:

1 Like

I see you are an ESP convert :slight_smile:
I use a couple of them spread around the house to control relays and ‘stuff’ like my pool pump/chlorinator setup.
To tie them to MQTT and openHAB I use a fairly obscure IoT framework called Souliss

It was designed to run on Arduinos and other low-resource/low-power hardware, but it can implement crazy networks (semi-transparently) with multiple nodes running over RS485, where only a gateway node needs Ethernet/Wi-Fi to act as the main controller for networked applications…

2 Likes

@ILTPWC Tubesync is awesome. I just deployed it to my home server and am backing up a couple YT channels I watch regularly. Thanks for listing it here so I knew about it!

3 Likes

@ucav117
yt-dlp supports RSS feeds, so you can just set up a simple cronjob without anything else…
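For the cron route, a minimal sketch (the schedule, path, and channel URL are just placeholders; --download-archive keeps yt-dlp from re-downloading what it already has):

# crontab entry: check the channel nightly at 03:00
0 3 * * * cd /srv/media/yt/level1techs && yt-dlp --download-archive archive.txt 'https://www.youtube.com/@Level1Techs/videos'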

1 Like

I’ll check it. I’m trying to figure out a way to have stuff get downloaded into folders that my Jellyfin can index and play, plus rip audio-only copies for my Audiobookshelf so I can listen to podcasts that are done on YouTube. And I want it automated so I can just point it at a channel and then forget about it.
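For the audio-only part, something like this might do it (just a sketch; the path and channel URL are placeholders, and the mp3 conversion assumes ffmpeg is installed):

# download audio only and convert to mp3 for Audiobookshelf
yt-dlp -x --audio-format mp3 -o '/srv/audiobookshelf/podcasts/%(channel)s/%(title)s.%(ext)s' 'https://www.youtube.com/@SomePodcast/videos'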

I’ve got 2 sites, if you will: one at my parents’ and the second being my apartment.

Site1:

  • Opnsense

    • WireGuard
    • FRR (BGP)
    • Unbound (Split-Horizon DNS)
    • Lots of VLANs and Geo-IP Blocking
  • Proxmox

    • (Auto-) Restic (Backups)
    • Bind (DNS)
    • PiHole
    • Traefik
    • Tandoor
    • Photoprism
    • Nextcloud
    • Minecraft Modpack (sometimes when I feel like it)
    • Homer
    • Jellyfin
    • Plex
    • Vaultwarden
    • VyOS (for DN42 routing)
    • Scrutiny
    • Netbox
    • GNS3
    • SAMBA Share
    • SAMBA Active Directory
    • Keycloak
  • Pi

    • TVHeadend with SAT Tuner
    • OSCAM
    • Uptime Kuma
  • vServer

    • Mailcow

Site 2:

  • Linux Router (RHEL9 ARM + Ansible Playbooks; a rough config sketch follows after this list)

    • Nftables
    • Kea DHCP
    • WireGuard
    • Blocky DNS
    • Many VLANs
  • OpenStack

    • Yoga Release
    • All basic services
    • Octavia
    • Linstor SAN
    • OVN
  • Services on OpenStack

    • Jellyfin
    • Production Kubernetes Cluster (CAPI)
    • Dev-K8s Cluster
    • Docker Swarm with Nextcloud (for Debugging a friend’s setup)
    • Many more testing setups
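As a rough idea of the router config, here is a minimal sketch of one VLAN plus a default-drop nftables forward policy (the interface names, VLAN ID, and addresses are made up for illustration; the real setup lives in the Ansible playbooks):

# create a VLAN interface and give the router an address on it
ip link add link eth0 name eth0.10 type vlan id 10
ip addr add 10.10.0.1/24 dev eth0.10
# nftables: drop forwarded traffic by default, then let this VLAN out
nft add table inet filter
nft add chain inet filter forward '{ type filter hook forward priority 0; policy drop; }'
nft add rule inet filter forward iifname "eth0.10" oifname "eth0" accept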
2 Likes

cd folder
yt-dlp -f the-format yt-link

(generally 247+251 is the fastest for me; that is 720p plus the associated audio codec. Use whatever format you get by querying yt-dlp -F yt-link)

yt-dlp -f bestaudio yt-link

(if that doesn’t work, just use -F to find the audio-only containers and use, for example, -f 251 to download the audio only. Use the --write-description flag to get the description; certain podcasts publish their timestamps there)

Lol bilky, I understand how to use yt-dlp on the command line. I just don’t want to SSH into my server every time I want to add a channel.

It used to be more complicated to download channels, involving the channelid (which was easy to find from any video of a channel → page source → search for channelid). Nowadays it’s as easy as

cd channel-folder
yt-dlp "https://www.youtube.com/@Level1Techs/videos"

This will download the entire L1T channel at the best quality. You may want to pin a specific quality (generally I put -f 247+251 before the link, for 720p plus audio, but you can use other container types by querying any video with -F link).

I used to download more stuff, but for the past 2 or 3 years that I’ve been using yt-dl(p), it has been an absolute necessity to check my feed and select what I watch; even more so nowadays, when I have even less time to spare. I would not be able to download entire channels (I don’t have the space for that anyway).

You can download a playlist of a channel instead, using the same options, but you have to keep the video or audio files even after watching if you want yt-dlp to do its thing without you having to worry about it. If space is a concern, there should be a way to automate RSS gathering with newsboat: query its SQLite DB when there are updates on a certain feed, and automatically download the fresh items, i.e. the new videos to watch or listen to.

I just don’t know how. It’d probably be something like:

sqlite3 cache.db "select url from rss_item where url like '%youtube%' and unread = 1;" | xargs -n 1 yt-dlp -f 247+251

Something like that, but if someone knows their way around SQLite querying, please help us; it might get us some fancy newsboat RSS automation.

1 Like

The tagging system is really really helpful. Other than that, it’s nothing special.

You can do full text search as well, so I guess there’s that.

2 Likes

For me, that all runs on very old hardware (socket 1155… I just replaced the i3-2120T with an E3-1275 V2 a couple of years back; 16GB RAM, an LSI HBA, 7 HDDs and 2 SATA SSDs). After a few years under Ubuntu, I got fed up with the RAID management (I was using mdadm) and switched to Unraid for the ability to mix and match different disk sizes (yes, I know it is doable in Ubuntu too, but I did not know that at the time). I have been happy with it so far.

Running SMB from Unraid, then, plus a couple of Docker containers:

  • Plex and Jellyfin (been using Plex for many years, just added Jellyfin as a test/backup)
  • Sonarr, Radarr, Jackett
  • Transmission
  • Unifi Controller
  • Wireguard VPN (just added this week)
  • Crashplan-pro management Docker for my cloud backup
  • Home Assistant (with a Mosquitto broker)
  • Nextcloud
  • Pihole
  • Vaultwarden
  • Swag (reverse proxy) to access some of those services from outside

All containers have persistent data that is backed up to the cloud via CrashPlan.

Thanks for this thread, it made me discover some stuff :slight_smile:

2 Likes

I tend to prefer keeping them separate; that prevents the issue where one service upgrades to a newer DB version but another forces you to hold back…
Actually, for that reason, I prefer self-contained containers rather than stacks.

1 Like

Yeah, they’re all running in separate containers. They’re usually part of the stacks the applications need. I hate using the local DB instance inside the applications.

That was also one point I was thinking about regarding separate or consolidated DBs. I’m gonna keep them separate; I don’t think I’d see much of a performance gain if I went with a single one instead.

TrueNAS Core here (5 bare metal installs), doing:

Samba
SyncThing (Plugin)
Tailscale (in VM)
Plex (plugin)
Windows backup target
Replication

And a Raspberry Pi doing:
Pi-hole

I did have home assistant running on another Pi, but it saw little use so I’ve put it aside. Tempted to sell, but I might have another use for it some day.

I’d quite like to play with xcp-ng, but I really need more purpose for it.

1 Like

At that point, all you need to do is dump the DB, then restore it either to the same version and upgrade, or to an already-upgraded instance. Not a big deal, but you do need to change the DB config for your programs (unless, if you are smart, you are using a DNS CNAME pointing to the FQDN of the DB; when the time comes to upgrade, just point the CNAME at the new instance and pretend that nothing happened).
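A minimal sketch of that flow, assuming PostgreSQL; the hostnames (db-old.lan, db-new.lan, and the db.lan CNAME) are made up for illustration:

# dump from the old instance
pg_dump -h db-old.lan -U app appdb > appdb.sql
# restore into the already-upgraded instance (assumes an empty appdb exists there)
psql -h db-new.lan -U app appdb < appdb.sql
# then repoint the CNAME (db.lan -> db-new.lan) and the apps keep talking to
# db.lan as if nothing happened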

Check out Invidious. It’s a self-hosted alternative web UI for YouTube.

Self-hosting:

HAOS (Home Assistant)
Invidious
Xpenology (Synology, but on custom hardware)
Proxmox

2 Likes

Snapcast. It’s like an open-source Sonos! I’ve got it connected to Spotify, AirPlay, and Mopidy.