I see people mentioning small office desktops, and they are good, but I will warn you that they often use proprietary parts, so upgrading and repairing them can be difficult. Also, jellyfin.org has some good info under the hardware acceleration section on what to use.
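If you want to sanity-check whether a box can do hardware transcoding before you commit to Jellyfin on it, here is a minimal Python sketch. It assumes a Linux system with the standard /dev/dri device path and just looks for the render nodes that VAAPI/Quick Sync use; it is a quick check, not the official Jellyfin method.

```python
# Sketch: check for GPU render nodes used by VAAPI/Quick Sync hardware transcoding.
# Assumes Linux and the standard /dev/dri device path.
from pathlib import Path


def render_nodes() -> list[Path]:
    """Return any /dev/dri/renderD* nodes found on this machine."""
    dri = Path("/dev/dri")
    return sorted(dri.glob("renderD*")) if dri.is_dir() else []


if __name__ == "__main__":
    nodes = render_nodes()
    if nodes:
        print("Hardware transcoding device(s) found:", ", ".join(str(n) for n in nodes))
    else:
        print("No /dev/dri render nodes found; Jellyfin would fall back to software transcoding.")
```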
A used mini computer, like a Lenovo ThinkCentre, HP ProDesk Mini, or Dell OptiPlex Micro.
Dunno about affordable but you can usually find some decently priced 1L Dell Optiplex micro systems. I’ve got one running under my desk 24/7. Great Linux support.
You can try the Minisforum MS-01. It's relatively compact, inexpensive, has a lot of options for expandability, and comes with relatively powerful Intel CPUs for LLM work plus Quick Sync for transcoding. Here is a nice overview of the device.
If you want to run Ollama and other ML stuff, you're looking at buying an RTX4090, my friend. Affordable and ML are two things you can't put into one sentence.
While you certainly can run AI models that require such a beefy GPU, there are plenty of models that run fine even on a CPU-only system. So it really depends on what exactly Ollama is going to be used for.
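As a concrete example, here is a rough sketch of a CPU-only use case against a local Ollama instance on its default port (11434). It assumes you have already pulled a small model; the model name "phi3" below is just an example, substitute whatever you actually use.

```python
# Sketch: single non-streaming request to a local Ollama instance.
# Assumes Ollama is running on the default port and a small model has been pulled.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"


def ask(prompt: str, model: str = "phi3") -> str:
    """Send one generate request and return the model's reply."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,  # CPU-only inference can be slow, so allow a generous timeout
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    print(ask("Summarise why ECC RAM matters for an always-on home server."))
```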
I am satisfied if it can run a 7/8B relatively fast
Go on eBay or your local second-hand market and search "mini PC" or "office computer"...
How small? How many drives? I bought several used Lenovo P330 systems with the Xeon E-2276G for my servers.
The Intel CPU has a great low-power integrated GPU for video encoding/decoding, which is exactly what you want for video streaming.
The Xeon's ECC RAM gives long-term reliability. That's important if you leave your PC on 24/7 for years at a time.
I am not a big fan of buying used.
Used servers/workstations are likely more reliable than new consumer hardware.
They were very likely kept temperature controlled, have ECC, and are actually known to be working, unlike something brand new from a consumer brand like ASUS. If I remember correctly, PC mortality is very high in the first 6 months, drops to near zero for about 5 years, then starts going back up.
Replace the SSD/hard drive and you are good. You might not even have to do that. I checked the stats on the SSD that came with my used Lenovo workstation and it had like 20 hours on it.
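If you want to check the hours on a used drive yourself, something like the sketch below works. It assumes smartmontools is installed and that you run it with enough privileges to read SMART data; it just shells out to smartctl and looks for the power-on-hours line.

```python
# Sketch: read a drive's power-on hours via smartctl (from smartmontools).
# Assumes smartmontools is installed and the script has permission to query the device.
import subprocess
import sys


def power_on_hours(device: str) -> str:
    """Return the power-on-hours line reported by smartctl, if any."""
    out = subprocess.run(
        ["smartctl", "-A", device],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.splitlines():
        # ATA drives report a "Power_On_Hours" attribute; NVMe drives print "Power On Hours:".
        if "Power_On_Hours" in line or "Power On Hours" in line:
            return line.strip()
    return "Power-on hours not reported"


if __name__ == "__main__":
    print(power_on_hours(sys.argv[1] if len(sys.argv) > 1 else "/dev/sda"))
```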
Why not? It would help massively with the 'affordable' criterion
I just need something that works. I've had a bad experience with a previous model that wouldn't boot on my Ubuntu server drive, no matter how much time I spent on it. But if you know of any models that are worth checking out, I'm all ears.
Might be worth trying to find a refurbished HP ProLiant MicroServer. There are a few on eBay UK within the £200-400 range. You can sometimes find professionally refurbished units as well.
I love my NUCs but haven't really paid attention to what has happened since Intel sold that line to ASUS.
Thanks, I will take a look at the NUC.
what are you gonna use it for?
Something that is not ARM-based. I'm looking to set up a system that can run Jellyfin, Ollama, and a few other small services. By 'pre-built', I mean I want to buy a device that already has the necessary hardware installed, so all I need to do is install the operating system and I'm good to go.
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I've seen in this thread:
| Fewer Letters | More Letters |
|---|---|
| NUC | Next Unit of Computing brand of Intel small computers |
| NVMe | Non-Volatile Memory Express interface for mass storage |
| Plex | Brand of media server package |
| SSD | Solid State Drive mass storage |
I've had a good experience so far with two mini PCs: a Mele Quieter 4C for Kodi, and a MoreFine M9 (I think this one is branded as mipowcat in the EU). They're both N100-based; the M9 can go up to 32GB of RAM, although it is picky about which modules it will accept. I use the M9 for Jellyfin and about 10 other services. Quick Sync works great as far as I've tested it. For Jellyfin I'm relying mostly on direct streaming, but I tried a few episodes with forced transcoding by using Firefox for playback and it worked fine.
I like my HPE MicroServer Gen10 Plus.
It does not come with a GPU by default, but you can install a low-power one.
What are the reviews on the hardware offered by the umbrelOS guys?
Edit: something that is not ARM-based
You want pre-built to run ollama, that's at least gonna cost you an arm, maybe even a leg.
It depends on the model you run. Mistral, Gemma, or Phi are great for a majority of devices, even with CPU or integrated graphics inference.
Check out used tiny/mini/micro desktops on eBay. Loads of info here: https://www.servethehome.com/introducing-project-tinyminimicro-home-lab-revolution/
Only downside is going to be no GPU for the AI workload. Maybe some of the later AMD APUs could cut it. If not, all three major manufacturers have SFF variants that are pretty much the same hardware in a little bigger case. Those will accept smaller off the shelf GPUs.
A 2-bay NAS with a Ryzen 7 and up to 32GB of RAM.
There is also a cheaper option with an N100.
Thanks, I will take a look at that.
You need to first explain what you want the server for, because that will give us an idea of your CPU and storage requirements.
I'm looking to set up a system that can run Jellyfin, Ollama, and a few other small services.
Ollama is a big ask. Do you want it to be fast? Then you will need a GPU. How large is the model you will be running? A 7/8B model on CPU is not as fast, but no problem; 13B is slow on CPU but possible.
I am satisfied if it can run a 7/8B relatively fast
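If you want a rough number rather than a feeling, you can measure tokens per second through the Ollama API. The sketch below assumes a local instance on the default port and relies on the eval_count/eval_duration fields that the non-streaming /api/generate response reports; the model names at the bottom are placeholders for whatever you have pulled.

```python
# Sketch: rough tokens-per-second measurement for a model running under Ollama.
# Assumes a local Ollama instance on the default port and already-pulled models.
import requests


def tokens_per_second(model: str, prompt: str = "Explain what a reverse proxy does.") -> float:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=600,
    )
    resp.raise_for_status()
    data = resp.json()
    # eval_duration is reported in nanoseconds, eval_count is the number of generated tokens
    return data["eval_count"] / (data["eval_duration"] / 1e9)


if __name__ == "__main__":
    for m in ("phi3", "mistral"):  # placeholder model names; use whatever you have pulled
        print(f"{m}: {tokens_per_second(m):.1f} tok/s")
```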
I'm not sure if they're still affordable, but I ended up getting both a MoreFine and a Beelink, one with the Intel N100 CPU and the other with the N305. They handle everything I've thrown at them, and come with out-of-the-box Quick Sync transcoding for Jellyfin/Plex. They handle 4K transcodes like a champ. Couple that with 2.5GbE networking and they sip power. Though they might have gone up in price since I bought mine.
I have a Beelink running Jellyfin and it's fine.
It has the N100.
Prebuilt like a traditional server? I personally use an Orange Pi and it's pretty good. Just make sure to use the open-source ARM OS.
Raspberry Pi