Skip Navigation
Selfhosted

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
  2. No spam posting.
  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.
  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).
  6. No trolling.

Resources:

Members
45,999
Posts
4,320
Active Today
545
Created
2 yr. ago
  • Selfhosted @lemmy.world

    Looking to start self hosting by going through Louis Rossman's recently released guide. Any pointers for a newbie are most welcome.

  • Selfhosted @lemmy.world

    Welcome to [email protected] - What do you selfhost?

  • Selfhosted @lemmy.world
    Otter @lemmy.ca

    Synology's telegraphed moves toward a contained ecosystem and seemingly vertical integration are certain to rankle some of its biggest fans, who likely enjoy doing their own system building, shopping, and assembly for the perfect amount of storage. "Pro-sumers," homelab enthusiasts, and those with just a lot of stuff to store at home, or in a small business, previously had a good reason to buy one Synology device every so many years, then stick into them whatever drives they happened to have or acquired at their desired prices. Synology's stated needs for efficient support of drive arrays may be more defensible at the enterprise level, but as it gets closer to the home level, it suggests a different kind of optimization.

  • Selfhosted @lemmy.world
    iAmTheTot @sh.itjust.works

    Need help getting domain to resolve over LAN

    Hey all. I'm hosting a Docmost server for myself and some friends. Now, before everyone shouts "VPN!" at me, I specifically want help with this problem. Think of it as a learning experience.

    The problem I have is that the Docmost server is accessible over internet and everyone can log on and use it, it's working fine. But when I try to access over LAN, it won't let me log in and I am 99% sure it's related to SSL certs over LAN from what I've read.

    Here's the point I've gotten to with my own reading on this and I'm just stumped now:

    I've got an UNRAID server hosted at 192.186.1.80 - on this server, there's a number of services running in docker containers. One of these services is Nginx Proxy Manager and it handles all my reverse proxying. This is all working correctly.

    I could not for the life of me get Docmost working as a docker container on UNRAID, so instead I spun up a VM and installed it on there. That's hosted at 192.168.1.85 and NPM points to it when you try to access

  • Selfhosted @lemmy.world
    irmadlad @lemmy.world

    Question About Watchtower

    So, I finally installed Watchtower to auto update my containers. I was a little hesitant because just letting apps auto update kind of makes me a little nervous. Even Windows updates give me bouts of trepidation. Everything went well, there was a little hiccup with Netdata but resolved in less than 5 minutes.

    My question is that there are four remaining containers that haven't been updated: Speedtest Tracker, Portainer, Doppler Task, and Dockge.

     undefined
        
    2025-04-19T06:00:46.510622594Z INFO[38092] Session done                                  Failed=0 Scanned=48 Updated=0 notify=no
    2025-04-19T08:00:46.040690535Z INFO[45292] Session done                                  Failed=0 Scanned=48 Updated=0 notify=no
    2025-04-19T10:00:45.952863778Z INFO[52492] Session done                                  Failed=0 Scanned=48 Updated=0 notify=no
    2025-04-19T12:00:47.755915129Z INFO[59694] Session done                                  Failed=0 Scanned=48 Updated=0 notify=no
    2025-04-19T14:00:50.0464984
      
  • Selfhosted @lemmy.world
    InternetCitizen2 @lemmy.world

    Syncing podcasts / rss over nextcloud

    I have a server on my local network. So far I have it sync my laptop and phone's Joplin and Calendar when I get home. I was thinking it would be neat to host my rss/podcast feeds on my nextcloud so that the read/listend to passes around.

    Is nextcloud generally the way to go with this or has the community gone some other path?

    Clients are a linux laptop and pixel with graphene if it matters.

  • Selfhosted @lemmy.world
    gedaliyah @lemmy.world

    SMS/MMS backup and sync?

    In my journey to self hosting and Degoogling, one thing I've missed is being able to access my phone on a computer. Is there a self hosted solution that allows syncing between text messaging and a PC/web interface?

    I don't necessarily need a sophisticated Features like customer management or automation. I just want to access my messages from another device, and of course have a server based backup. The ability to reply to messages from the computer is a plus but not necessary. Is there a good option for this?

  • Selfhosted @lemmy.world
    Otter Raft @lemmy.ca

    Endurain is a self-hosted fitness tracking service designed to give users full control over their data and hosting environment

    You can find screenshots on this page: https://docs.endurain.com/gallery/

  • Selfhosted @lemmy.world
    gwheel @lemm.ee

    Is it worth migrating docker apps to truenas scale community apps?

    I've got forgejo configured and running as a custom docker app, but I've noticed there's a community app available now. I like using the community apps when available since I can keep them updated more easily than having to check/update image tags.

    Making the switch would mean migrating from sqlite to postgres, plus some amount of file restructuring. It'll also tie my setup to truenas, which is a platform I like, but after being bit by truecharts I'm nervous about getting too attached to any platform.

    Has anyone made a similar migration and can give suggestions? All I know about the postgres config is where the data is stored, so I'm not even sure how I'd connect to import anything. Is there a better way to get notified about/apply container images for custom apps instead?

  • Selfhosted @lemmy.world
    Sips' @slrpnk.net

    Self-Hosted podcast has announced that episode 150 is their last.

    Unfortunate news for those of us who have been following this podcast, its been a very entertaining and educational podcast. Unfortunately it ends in three episodes. Here are the podcast details for those who want to hear about it - its at the beginning of the episode.


    Self-Hosted: 147: The Problem with Game Streaming

    Episode webpage: https://selfhosted.show/147

    Media file: https://aphid.fireside.fm/d/1437767933/7296e34a-2697-479a-adfb-ad32329dd0b0/431317f3-db02-48b3-a9c6-3cb43108daf9.mp3

  • Selfhosted @lemmy.world
    traches @sh.itjust.works

    Incremental backups to optical media: tar, dar, or something else?

    I'm working on a project to back up my family photos from TrueNas to Blu-Ray disks. I have other, more traditional backups based on restic and zfs send/receive, but I don't like the fact that I could delete every copy using only the mouse and keyboard from my main PC. I want something that can't be ransomwared and that I can't screw up once created.

    The dataset is currently about 2TB, and we're adding about 200GB per year. It's a lot of disks, but manageably so. I've purchased good quality 50GB blank disks and a burner, as well as a nice box and some silica gel packs to keep them cool, dark, dry, and generally protected. I'll be making one big initial backup, and then I'll run incremental backups ~monthly to capture new photos and edits to existing ones, at which time I'll also spot-check a disk or two for read errors using DVDisaster. I'm hoping to get 10 years out of this arrangement, though longer is of course better.

    I've got most of the pieces worked out, but the last big questi

  • Selfhosted @lemmy.world
    Possibly linux @lemmy.zip

    What CI/CD tools are you guys using? I have Forgejo but I need a simple way of running automation.

    My current picks are Woodpecker CI and Forgejo runners. Anything else that's lightweight and easy to manage?

  • Selfhosted @lemmy.world
    lent9004 @lemmy.world

    MAZANOKE v1.1.0: Self-hosted local image optimizer in your browser — now supports HEIC, clipboard paste, and more

    MAZANOKE is a simple image optimizer that runs in your browser, works offline, and keeps your images private without ever leaving your device.

    Created for everyday people and designed to be easily shared with family and friends, it serves as an alternative to questionable "free" online tools.

    See how you can easily self-host it here:
    https://github.com/civilblur/mazanoke


    Highlights from v1.1.0 (view full release note)

    I'm delighted to present some much-requested features in this release, including support for HEIC file conversion!

    • Added support to convert HEIC, AVIFJPG, PNG, WebP.
    • Paste image/files from clipboard to start optimization.
    • When setting a file size limit, you can switch between units MB and KB.
    • Remember last-used settings, stored locally in the browser.

    The support from the community has been incredibly encouraging, an

  • Selfhosted @lemmy.world
    hit_the_rails @reddthat.com

    Is it normal to not have any malicious login attempts?

    cross-posted from: https://reddthat.com/post/39309359

    I've been running Home Assistant for three years. It's port forwarded on default port 8123 via a reverse proxy in a dedicated VM serving it over HTTPS and is accessible over ipv4 and ipv6. All user accounts have MFA enabled.

    I see a notification every time there's a failed login attempt, but every single one is either me or someone in my house. I've never seen a notification for any other attempts from the internet. Not a single one.

    Is this normal? Or am I missing something? I expected it to be hammered with random failed logins.

  • Selfhosted @lemmy.world
    danb @feddit.uk
  • Selfhosted @lemmy.world
    Possibly linux @lemmy.zip

    Am I the only one interested in Fedora based containers?

    I'm been listening to the Fedora podcast and it seems like the OCI images are now getting some serious attention.

    Anyone using the Fedora base image to make custom containers to deploy Nextcloud, Caddy and other services? My thought is that Fedora focuses on security so in theory software packaged with it will be secure and properly configured by default. Having Fedora in the middle will also theoretically protect against hostile changes upstream. The downside is that the image is a little big but I think it is manageable.

    Anyone else use Fedora?

  • Selfhosted @lemmy.world
    whoareu @lemmy.ca

    finally got static IP from a new ISP

    Hello folks,

    I got my static IP and I am very happy now, I have been hosting a lot of services since I got the static IP, however I still have to host a fediverse service however it's not that easy to host any fediverse service, I tried to host GoToSocial but the devs said they don't support Podman and my server is podman only ( I tried installing docker but it was failing for some reason so I gave up and used podman instead of docker).

    these are the services I am currently hosting ( basically all the easy services which you can host with just "docker compose up -d" :p ):

    feel free to suggest some other cool services which I can host :D

  • Selfhosted @lemmy.world
    marauding_gibberish142 @lemmy.dbzer0.com

    How to self-host a highly available git server cluster?

    Edit: it seems like my explanation turned out to be too confusing. In simple terms, my topology would look something like this:

    I would have a reverse proxy hosted in front of multiple instances of git servers (let's take 5 for now). When a client performs an action, like pulling a repo/pushing to a repo, it would go through the reverse proxy and to one of the 5 instances. The changes would then be synced from that instance to the rest, achieving a highly available architecture.

    Basically, I want a highly available git server. Is this possible?


    I have been reading GitHub's blog on Spokes, their distributed system for Git. It's a great idea except I can't find where I can pull and self-host it from.

    Any ideas on how I can run a distributed cluster of Git servers? I'd like to run it in 3+ VMs + a VPS in the cloud so if something dies I still have a git server running somewhere to pull from.

    Thanks

  • Selfhosted @lemmy.world
    iturnedintoanewt @lemm.ee

    NanoKVM - What's the status?

    Hi guys! What's the status of the Sipeed NanoKVM FOSS image? I was subscribed to the thread, and I even saw Jeff Geerling's comments. Eventually they claimed the whole image was open source, and left it at that. If you go now to their github, the last published image is from February, v1.4..0. But everyone talks about the last upgrade to 2.2.5? In fact, if I connect my NanoKVM, it does detect that update, but I don't think it's the fully open sourced version? Is this correct?

    Anyone can provide a bit more detail on what's going on? Should I manually flash v1.4 that you can download from the repo? And if so...how do I do it?

    Thanks!

  • Selfhosted @lemmy.world
    GuardYaGrill @sh.itjust.works

    Force Lidarr to re-manage media?

    Long story short my Lidarr instance wasn’t creating Album folders and just dropping all the media into the Artists folder after importing, I noticed this because my Jellyfin instance was improperly displaying Albums as Playlists and not getting metadata as intended.

    This is quite unfortunate as a lot of content was downloaded and not properly organized, I tried going at it manually however quickly realized how much media was just loosely tossed around.

    Is there anyway to force Lidarr to re-manage media already imported or even a docker image designed specifically for media management that I could quickly spin up?

    Edit: I believe I already fixed the root cause of my issue above, just need to figure out a logical way of going about the content that is already messed up.