Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
- Don't duplicate the full text of your blog or github here. Just post the link for folks to click.
- Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues with the community? Report them using the report flag.
Questions? DM the mods!
It's an extremely fast and insecure way to set up services. Avoid it unless you want to download and execute malicious code.
Please explain this to me
Package managers like apt use cryptography to check signatures on everything they download, to make sure it isn't malicious.
Docker doesn't do this. They have a system called DCT, but it's horribly broken (not to mention off by default).
So when you run `docker pull`, you can't trust anything it downloads.
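As a rough illustration of the difference (nginx here is just a stand-in package and image name):

```
# apt refuses to use a repository whose signed metadata fails verification
# against the keys already on disk, so the install below is signature-checked:
apt-get update && apt-get install nginx

# docker pull relies only on TLS and the content digest; no publisher
# signature is checked unless you explicitly opt in:
docker pull nginx:latest
```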
Thank you very much! For the off-by-default part I can agree, but why is it horribly broken?
PKI.
Apt and most release signing have a root of trust shipped with the OS, and the PGP keys are cross-signed on keyservers (web of trust).
DCT is just TOFU (trust on first use). They disable it because it gives a false sense of security. Docker is just not safe. Maybe in 10 years they'll fix it, but honestly it seems like they just don't care. The well is poisoned. Avoid. Use apt or some package manager that actually cares about security.
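To make the comparison concrete, a quick sketch (exact paths vary by distro, and nginx is just a stand-in image):

```
# Debian/Ubuntu ship the archive signing keys with the OS itself, so apt
# can verify repository metadata on a fresh install:
ls /usr/share/keyrings/ /etc/apt/trusted.gpg.d/

# Docker Content Trust is opt-in, per shell, and trusts a tag's signing
# key the first time it sees it (TOFU):
export DOCKER_CONTENT_TRUST=1
docker pull nginx:latest   # pulls fail for tags with no Notary trust data
```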
So, if I understand correctly: rather than using prebuilt images from Docker Hub or untrusted sources, the recommended approach is to start from a minimal base image of a known OS (like Debian or Ubuntu), and explicitly install required packages via apt within the Dockerfile to ensure provenance and security. Does that make sense?
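For example, roughly like this (a sketch only; the image tag and the nginx package stand in for whatever you actually need, and the Dockerfile is fed to docker build over stdin):

```
# Build from an official Debian base and let apt do the signature-checked
# package installation inside the image:
docker build -t my-nginx - <<'EOF'
FROM debian:bookworm-slim
RUN apt-get update \
 && apt-get install -y --no-install-recommends nginx \
 && rm -rf /var/lib/apt/lists/*
CMD ["nginx", "-g", "daemon off;"]
EOF
```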
Install the package with apt. Avoid docker completely.
If the Docker image maintainer has a GitHub, open a ticket asking them to publish a Debian package.
I see your point about trusting signed Debian packages, and I agree that’s ideal when possible. But Docker and APT serve very different purposes — one is for OS-level package management, the other for containerization and isolation. That’s actually where I got a bit confused by your answer — it felt like you were comparing tools with different goals (due to my limited knowledge). My intent isn’t just to install software, but to run it in a clean, reproducible, and isolated environment (maybe more than one on the same hosting machine). That’s why I’m considering building my own container from a minimal Debian base and installing everything via apt inside it (roughly the sketch below), to preserve trust while still using containers responsibly! Does this make sense to you? Thank you again for taking the time to reply to my dumb messages!
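For concreteness, something like this is what I mean (a sketch only; the image name, container names, and ports are arbitrary placeholders):

```
# Build the apt-based image once, then run several isolated copies of it
# side by side on the same host:
docker build -t my-service .
docker run -d --name svc-a -p 8081:80 my-service
docker run -d --name svc-b -p 8082:80 my-service
```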
Containers have been around for decades. Look into LXC.
But for the best security, you want VMs. Look into Proxmox.
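A minimal sketch of the classic LXC workflow, in case it helps (container name, distro, and release are just examples):

```
# Create a Debian container from the download template, start it, and
# run a command inside it:
lxc-create -n web -t download -- -d debian -r bookworm -a amd64
lxc-start -n web
lxc-attach -n web -- apt-get update
```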
Thank you man! I will look further into that
You know container image attestations are a thing, right?
You know it doesn't verify any signature on download, right?
A signature only tells you where something came from, not whether it’s safe. Saying APT is more secure than Docker just because it checks signatures is like saying a mysterious package from a stranger is safer because it includes a signed postcard and matches the delivery company’s database. You still have to trust both the sender and the delivery company. Sure, it’s important to reject signatures you don’t recognize—but the bigger question is: who do you trust?
APT trusts its keyring. Docker pulls over HTTPS with TLS, which already ensures you’re talking to the right registry. If you trust the registry and the image source, that’s often enough. If you don’t, tools like Cosign let you verify signatures. Pulling random images is just as risky as adding sketchy PPAs or running curl | bash—unless, again, you trust the source. I certainly trust Debian and Ubuntu more than Docker the company, but “no signature = insecure” misses the point.
Pointing out supply chain risks is good. But calling Docker “insecure” without nuance shuts down discussion and doesn’t help anyone think more critically about safer practices.
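For reference, verifying a signed image with Cosign looks roughly like this (the image reference and signer identity below are placeholders; the real values come from whoever publishes the image):

```
# Keyless verification of a published image signature with Cosign:
cosign verify \
  --certificate-identity-regexp 'https://github.com/example/.+' \
  --certificate-oidc-issuer https://token.actions.githubusercontent.com \
  ghcr.io/example/app:latest
```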
Oof, TLS isn't a replacement for signatures. There's a reason most package managers use release signatures: X.509 is broken.
And yes, PGP has a WoT to solve its PKI problem. That's why we can trust apt sigs and not Docker sigs.
Entirely depends on who's publishing the image. Many projects publish their own images, in which case you're running their code regardless.
Nope. See DCT. It's a joke.
Use apt.