this post was submitted on 25 Dec 2023
37 points (95.1% liked)
People Twitter
People tweeting stuff. We allow tweets from anyone.
RULES:
- Mark NSFW content.
- No doxxing people.
- Must be a pic of the tweet or similar. No direct links to the tweet.
- No bullying or international politics
- Be excellent to each other.
- Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.
It's getting there. In the next few years, as hardware gets better and models get more efficient, we'll be able to run these systems entirely locally.
I'm already doing it, but I have some higher end hardware.
Could you please share your process for us mortals?
Stable Diffusion SDXL Turbo model running in Automatic1111 for image generation.
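For anyone curious: Automatic1111 exposes a small HTTP API when launched with the `--api` flag. Here's a minimal sketch of calling its `txt2img` endpoint from Python — the host/port and the step/guidance defaults are assumptions for SDXL Turbo, so adjust to your own setup:

```python
import json
import urllib.request

def build_txt2img_payload(prompt: str) -> dict:
    # SDXL Turbo is distilled for very few sampling steps, so a single
    # step and low guidance are reasonable defaults for it (assumption).
    return {
        "prompt": prompt,
        "steps": 1,
        "cfg_scale": 1.0,
        "width": 512,
        "height": 512,
    }

def generate(prompt: str, base_url: str = "http://127.0.0.1:7860") -> dict:
    # Requires the webui to be running with --api. The JSON response
    # includes base64-encoded images under the "images" key.
    req = urllib.request.Request(
        f"{base_url}/sdapi/v1/txt2img",
        data=json.dumps(build_txt2img_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Just print the payload; call generate() when the webui is up.
    print(json.dumps(build_txt2img_payload("a snowy cabin at dusk"), indent=2))
```

The same payload works from curl or any other HTTP client, since it's just JSON over localhost.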
Ollama with Ollama-webui for an LLM. I like the Solar:7b model. It's lightweight, fast, and gives really good results.
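Ollama likewise serves a local HTTP API (default port 11434), so you can script it without the webui. A minimal sketch of querying the solar:7b model through it — the port and prompt are just example assumptions:

```python
import json
import urllib.request

def build_generate_payload(prompt: str, model: str = "solar:7b") -> dict:
    # stream=False asks for one complete JSON response instead of a
    # stream of partial tokens.
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
    }

def ask(prompt: str, base_url: str = "http://127.0.0.1:11434") -> str:
    # Requires the Ollama server to be running and the model pulled
    # (`ollama pull solar:7b`). Returns the generated text.
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=json.dumps(build_generate_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    # Just print the payload; call ask() when the server is running.
    print(json.dumps(build_generate_payload("Why is the sky blue?"), indent=2))
```

For interactive use, `ollama run solar:7b` in a terminal does the same thing without any code.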
I run it on some beefy hardware, but that's not strictly necessary.