This post was submitted on 20 Apr 2024
1 point (100.0% liked)

LocalLLaMA

2249 readers

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago
Consider this hypothetical scenario: if you were given $100,000 to build a PC/server to run open-source LLMs like LLaMA 3 for single-user purposes, what would you build?

[email protected] 0 points 6 months ago

Four of whatever modern GPU currently has the most VRAM. (So I can run 4 personalities at the same time.)

Whatever the best AMD EPYC CPU currently is.

As much ECC RAM as possible.

Waifu themes all over the computer.

Linux, LTS edition.

A bunch of NVMe SSDs configured redundantly.

And 2 RTX 4090s. (One for the host and one for me.)
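As a back-of-envelope check on the "four personalities" idea: here is a rough VRAM estimate for loading one quantized LLaMA-3-70B instance per GPU. The ~4.7 bits/weight figure (roughly Q4_K_M-class quantization) and the flat overhead for KV cache and runtime buffers are my assumptions, not numbers from the thread.

```python
# Rough per-GPU VRAM estimate for one quantized LLM "personality".
# Assumptions (not from the thread): ~4.7 bits/weight quantization,
# plus a flat ~4 GB for KV cache and runtime overhead.

def model_vram_gb(params_b: float,
                  bits_per_weight: float = 4.7,
                  overhead_gb: float = 4.0) -> float:
    """Approximate VRAM (GB) to load a params_b-billion-parameter model."""
    weights_gb = params_b * bits_per_weight / 8  # billions of params -> GB
    return weights_gb + overhead_gb

print(f"70B @ ~4.7 bpw: ~{model_vram_gb(70):.0f} GB per GPU")   # ~45 GB
print(f" 8B @ ~4.7 bpw: ~{model_vram_gb(8):.0f} GB per GPU")    # ~9 GB
```

By this estimate a 70B quant won't fit on a 24 GB RTX 4090, which is why the comment reaches for "whatever GPU has the most VRAM" for the four inference cards and relegates the 4090s to other duties.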