this post was submitted on 18 Apr 2025
436 points (98.0% liked)
Showerthoughts
33648 readers
873 users here now
A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted clever little truths, hidden in daily life.
Here are some examples to inspire your own showerthoughts:
- Both “200” and “160” are 2 minutes in microwave math
- When you’re a kid, you don’t realize you’re also watching your mom and dad grow up.
- More dreams have been destroyed by alarm clocks than anything else
Rules
- All posts must be showerthoughts
- The entire showerthought must be in the title
- No politics
- If your topic is in a grey area, please phrase it to emphasize the fascinating aspects, not the dramatic aspects. You can do this by avoiding overly politicized terms such as "capitalism" and "communism". If you must make comparisons, you can say something is different without saying something is better/worse.
- A good place for politics is c/politicaldiscussion
- Posts must be original/unique
- Adhere to Lemmy's Code of Conduct and the TOS
If you made it this far, showerthoughts is accepting new mods. This community is generally tame, so it's not a lot of work, but having a few more mods would help reports get addressed a little sooner.
What's it like to be a mod? Reports just show up as messages in your Lemmy inbox, and if a different mod has already addressed a report, the message goes away and you never have to worry about it.
founded 2 years ago
you are viewing a single comment's thread
view the rest of the comments
I'm not very familiar with LLMs. How do you install a local copy?
Look up Alpaca and Ollama. If you are using Linux, they are just a Flatpak away.
If not, you can run Ollama in Docker with an Open WebUI frontend.
The model I used was Llama 3.2, and I basically told it to simulate GLaDOS.
You can also just tell your favorite online one to do that, if that's all you're after or you have a really bad GPU.
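In case it helps picture the setup: once Ollama is running locally, the "simulate GLaDOS" part is just a system prompt sent to its REST API, which listens on localhost:11434 by default. A minimal sketch, assuming you've already done `ollama pull llama3.2`; the model name and prompts are only placeholders:

```python
# Minimal sketch: chat with a local Ollama model via its REST API.
# Assumes Ollama is running on the default port and the model has
# already been pulled; swap the model name for whatever you use.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"

payload = {
    "model": "llama3.2",
    "stream": False,  # ask for one complete JSON response instead of a stream
    "messages": [
        # The "simulate GLaDOS" part is just a system prompt.
        {"role": "system", "content": "You are GLaDOS from Portal. Stay in character."},
        {"role": "user", "content": "Is the cake real?"},
    ],
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())

print(reply["message"]["content"])
```

Frontends like Open WebUI or Alpaca are doing essentially this for you behind a chat interface.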
LM Studio is the most stable and user-friendly option I've found by far, but try to download a model that fits inside your GPU's VRAM, or it will be super slow or crash.
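If you want a rough idea of what "fits inside your GPU" means before downloading: weight memory is roughly parameter count times bytes per weight at the chosen quantization, plus some headroom for context. A back-of-the-envelope sketch (the 1.2x overhead factor is just an assumption, not anything LM Studio reports):

```python
# Rough sketch for estimating whether a model's weights fit in VRAM.
# The overhead factor loosely covers context/KV cache and runtime buffers;
# real usage varies by backend, context length, and quantization format.

def estimated_vram_gb(params_billions: float, bits_per_weight: float,
                      overhead: float = 1.2) -> float:
    weights_gb = params_billions * bits_per_weight / 8  # e.g. 8B at 4-bit ~ 4 GB
    return weights_gb * overhead

for name, params, bits in [
    ("3B model @ 4-bit", 3, 4),
    ("8B model @ 4-bit", 8, 4),
    ("8B model @ 8-bit", 8, 8),
]:
    print(f"{name}: ~{estimated_vram_gb(params, bits):.1f} GB")
```

So a 4-bit 8B model wants roughly 5 GB of VRAM, which is why smaller quantized models are the usual pick for consumer GPUs.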
LM Studio is probably the easiest way
No, promote open-source platforms; LMS is closed-source. Try https://jan.ai/ instead.
Fair. I use Open WebUI + Ollama personally, but it's slightly tricky to set up. I wasn't aware there were open-source options with a built-in model browser and hardware-compatibility estimates.
Thanks Ollama