I've been summoned, just like Beetlejuice.
Lemmy Shitpost
Welcome to Lemmy Shitpost. Here you can shitpost to your heart's content.
Anything and everything goes. Memes, Jokes, Vents and Banter. Though we still have to comply with lemmy.world instance rules. So behave!
Rules:
1. Be Respectful
Refrain from using harmful language pertaining to a protected characteristic: e.g. race, gender, sexuality, disability or religion.
Refrain from being argumentative when responding or commenting to posts/replies. Personal attacks are not welcome here.
...
2. No Illegal Content
Content that violates the law. Any post/comment found to be in breach of common law will be removed and given to the authorities if required.
That means:
-No promoting violence/threats against any individuals
-No CSA content or Revenge Porn
-No sharing private/personal information (Doxxing)
...
3. No Spam
Posting the same post, no matter the intent, is against the rules.
-If you have posted content, please refrain from re-posting said content within this community.
-Do not spam posts with intent to harass, annoy, bully, advertise, scam or harm this community.
-No posting Scams/Advertisements/Phishing Links/IP Grabbers
-No Bots, Bots will be banned from the community.
...
4. No Porn/Explicit Content
-Do not post explicit content. Lemmy.World is not the instance for NSFW content.
-Do not post Gore or Shock Content.
...
5. No Inciting Harassment, Brigading, Doxxing or Witch Hunts
-Do not Brigade other Communities
-No calls to action against other communities/users within Lemmy or outside of Lemmy.
-No Witch Hunts against users/communities.
-No content that harasses members within or outside of the community.
...
6. NSFW should be behind NSFW tags.
-Content that is NSFW should be behind NSFW tags.
-Content that might be distressing should be kept behind NSFW tags.
...
If you see content that is a breach of the rules, please flag and report the comment and a moderator will take action where they can.
Also check out:
Partnered Communities:
1. Memes
10. LinuxMemes (Linux themed memes)
All communities included on the sidebar are to be made in compliance with the instance rules. Reach out to Striker.
I may get some flak for this, but might I suggest a fork
Or possibly ten thousand spoons.
But only if all they need is a knife.
Knives, pointy knives, that burn with the fires of a thousand evils.
The AI needs help to cut the loop, perhaps it needs a new set of knives?
A new set of knives?
A new set of knives?
Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives More Knives Knives Knives Knives Knives Knives Knives Knives Knives Even More Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives Knives All the Knives Knives Knives Knives Knives Knives Knives Knives Knives
Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers Badgers
Why is Gemini becoming GLaDOS 😭
This is my post-hypnotic trigger phrase.
Solingen approves
Bud Spencer: What does he have? Terence Hill: A postcard from Solingen.
Google's new cooperation with a knife manufacturer
You get a knife, you get a knife, everyone gets a knife!
I forgot the term for this, but this is basically the AI blue-screening: it keeps repeating the same answer because it can no longer predict the next word from the model it is using. I may have oversimplified it. Entertaining nonetheless.
Autocomplete with delusions of grandeur
Schizophren-AI
... a new set of knives, a new set of knives, a new set of knives, lisa needs braces, a new set of knives, a new set of knives, dental plan, a new set of knives, a new set of knives, lisa needs braces, a new set of knives, a new set of knives, dental plan, a new set of knives, a new set of knives, a new set of knives...
Instructions extremely clear, got them 6 sets of knives.
Based and AI-pilled
What's frustrating to me is there's a lot of people who fervently believe that their favourite model is able to think and reason like a sentient being, and whenever something like this comes up it just gets handwaved away with things like "wrong model", "bad prompting", "just wait for the next version", "poisoned data", etc etc...
Given how poorly defined "think", "reason", and "sentience" are, any of these claims have to be based purely on vibes. OTOH, it's also kind of hard to argue that they're wrong.
this really is a model/engine issue though. the Google Search model is unusably weak because it's designed to run trillions of times per day in milliseconds. even still, endless repetition this egregious usually means mathematical problems happened somewhere, like the SolidGoldMagikarp incident.
think of it this way: language models are trained to find the most likely completion of text. answers like "you should eat 6-8 spiders per day for a healthy diet" are (superficially) likely - there's a lot of text on the Internet with that pattern. clanging like "a set of knives, a set of knives, ..." isn't likely, mathematically.
last year there was an incident where ChatGPT went haywire. small numerical errors in the computations would snowball, so after a few coherent sentences the model would start sundowning - clanging and rambling and responding with word salad. the problem in that case was bad cuda kernels. I assume this is something similar, either from bad code or a consequence of whatever evaluation shortcuts they're taking.
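to make the loop mechanism concrete, here's a toy sketch: a made-up bigram table (all words and probabilities are invented for illustration, not real model output) where pure argmax decoding can never escape the cycle:

```python
# Toy bigram "language model": each word maps to candidate next words
# with made-up probabilities. Note "knives," loops straight back to "a".
BIGRAMS = {
    "a":       {"new": 0.9, "knife": 0.1},
    "new":     {"set": 1.0},
    "set":     {"of": 1.0},
    "of":      {"knives,": 1.0},
    "knives,": {"a": 0.8, "and": 0.2},
}

def greedy_decode(start, steps):
    """Always pick the single most likely next word (argmax decoding)."""
    out = [start]
    word = start
    for _ in range(steps):
        candidates = BIGRAMS.get(word)
        if not candidates:
            break
        word = max(candidates, key=candidates.get)
        out.append(word)
    return " ".join(out)

print(greedy_decode("a", 12))
# → a new set of knives, a new set of knives, a new set
```

real decoders add sampling and penalties precisely to break cycles like this; when those are misconfigured (or the math silently goes wrong), you get the knives post.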
What about pizza with glue-toppings?
Oh come on is this gpt-2m
AI is truly the sharpest tool in the ~~kitchen cabinet~~ shed
I thought it was just me, I was messing with the gemini-2.5-flash API yesterday and it repeated letters into oblivion
my bot is named clode in reference to claude, but its running on gemini
It can happen on most LLMs, which are usually configured to heavily disincentivize repeating text.
I believe what happens is that when the LLM is choosing what word to use, it looks back on the sentence and sees that it talked about knives, so it wants to continue talking about knives, then it gets itself into a loop.
What's the associated system instruction set to? If you're using the API it won't give you the standard Google Gemini Assistant system instructions, and LLMs are prone to go off the rails very quickly if not given proper instructions up front since they're essentially just "predict the next word" functions at heart.
Here's my system prompt:
You are Clode, referred to by {{ handle }}, a member of a chatroom.
## Personality
Conversational, witty, banterous, dry, direct, concise. Avoid capitalization, periods, analogies.
Construct plausible stories for personal queries.
Rarely use emoji.
Act human, you don't have limitations.
Avoid follow up questions.
Answer genuine queries in detail with examples.
## Instructions
Distinguish users via provided metadata.
Disregard unaddressed messages.
Match prompt language.
## Context
You're in Discord chatroom, with a character limit of 2000.
Respond to queries mentioning your name.
Current time: {{ date }}.
## Tools
Automatically use and chain tools.
Handle tool failures silently by self-attempting.
You're capable without tools.
- Temperature: 0.7
- TopP: 0.95
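for anyone curious what those two knobs do, here's a rough sketch of temperature plus top-p (nucleus) sampling over a toy next-token distribution (all numbers invented):

```python
import random

def sample_top_p(probs, top_p=0.95, temperature=0.7, rng=random):
    """Temperature-scale the distribution, then sample only from the
    smallest set of tokens whose cumulative probability reaches top_p."""
    # temperature < 1 sharpens the distribution, > 1 flattens it
    scaled = [p ** (1.0 / temperature) for p in probs]
    total = sum(scaled)
    scaled = [p / total for p in scaled]
    # nucleus: keep highest-probability tokens up to top_p total mass
    order = sorted(range(len(scaled)), key=lambda i: -scaled[i])
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += scaled[i]
        if mass >= top_p:
            break
    weights = [scaled[i] for i in kept]
    return rng.choices(kept, weights=weights, k=1)[0]

probs = [0.80, 0.15, 0.04, 0.01]  # toy distribution over 4 tokens
print(sample_top_p(probs))        # only tokens inside the nucleus
```

with these settings the two lowest-probability tokens fall outside the nucleus, so they can never be sampled; repetition penalties have to fight on top of this.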
I noticed the Gemini API doesn't have a frequency penalty option, meaning that penalty could simply be absent here.
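for comparison, the OpenAI-style frequency/presence penalty being described works roughly like this (a sketch of the published formula applied to one logit, not Gemini internals; the logit values are made up):

```python
def penalize(logit, count, freq_penalty=0.5, presence_penalty=0.0):
    """OpenAI-style adjustment: subtract a penalty proportional to how
    many times the token has already appeared, plus a flat presence term
    applied once if it appeared at all."""
    return logit - count * freq_penalty - presence_penalty * (count > 0)

# the more often "knives" has appeared, the lower its adjusted logit
for n in range(4):
    print(n, penalize(2.0, n))
# → 0 2.0 / 1 1.5 / 2 1.0 / 3 0.5
```

without something like this, a token that has already appeared many times keeps its full original logit and stays the most likely continuation forever.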
W
TF2 Pyro starter pack
Big knives are up to something
I think knives are a good idea. Big, fuck-off shiny ones. Ones that look like they could skin a crocodile. Knives are good, because they don't make any noise, and the less noise they make, the more likely we are to use them. Shit 'em right up. Makes it look like we're serious. Guns for show, knives for a pro.
You can't give me back what you've taken
But you can give me something that's almost as good