this post was submitted on 25 Oct 2024

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] [email protected] 0 points 1 month ago

The release of this next model comes at a crucial time for OpenAI, which just closed a historic $6.6 billion funding round that requires the company to restructure itself as a for-profit entity. The company is also experiencing significant staff turnover: CTO Mira Murati just announced her departure along with Bob McGrew, the company’s chief research officer, and Barret Zoph, VP of post training.

All the problems with “AI” are suddenly solved now that Altman needs to justify his funding. I’m sure senior executives are jumping ship right on the cusp of their great triumph, because they want to spend more time with their families.

[–] [email protected] 0 points 1 month ago (9 children)

I heard openai execs are so scared of how powerful the next model will be that they're literally shitting themselves every day thinking about it. they don't even clean it up anymore, the openai office is one of the worst smelling places on earth

[–] [email protected] 0 points 1 month ago (4 children)

really stretching the meaning of the word release past breaking if it’s only going to be available to companies friendly with OpenAI

Orion has been teased by an OpenAI executive as potentially up to 100 times more powerful than GPT-4; it’s separate from the o1 reasoning model OpenAI released in September. The company’s goal is to combine its LLMs over time to create an even more capable model that could eventually be called artificial general intelligence, or AGI.

so I’m calling it now, this absolute horseshit’s only purpose is desperate critihype. as with previous rounds of this exact same thing, it’ll only exist to give AI influencers a way to feel superior in conversation and grift more research funds. oh of course Strawberry fucks up that prompt but look, my advance access to Orion does so well I’m sure you’ll agree with me it’s AGI! no you can’t prompt it yourself or know how many times I ran the prompt why would I let you do that

That timing lines up with a cryptic post on X by OpenAI CEO Sam Altman, in which he said he was “excited for the winter constellations to rise soon.” If you ask ChatGPT o1-preview what Altman’s post is hiding, it will tell you that he’s hinting at the word Orion, which is the winter constellation that’s most visible in the night sky from November to February (but it also hallucinates that you can rearrange the letters of the post to spell “ORION”).

there’s something incredibly embarrassing about the fact that Sammy announced the name like a lazy ARG based on a GPT response, which GPT proceeded to absolutely fuck up when asked about. a lot like Strawberry really — there’s so much Binance energy in naming the new version of your product after the stupid shit the last version fucked up, especially if the new version doesn’t fix the problem

[–] [email protected] 0 points 1 month ago (6 children)

You forgot the best part, the screenshot of the person asking ChatGPT's "thinking" model what Altman was hiding:

Thought for 95 seconds ... Rearranging the letters in "they are so great" can form the word ORION.
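For anyone curious, the model's claim doesn't survive a ten-second sanity check. Here's a minimal sketch using Python's `Counter` to see whether the letters of "they are so great" can actually spell "orion" (the phrase and target are taken from the screenshot above):

```python
from collections import Counter

phrase = "they are so great"
target = "orion"

# Tally letters available in the phrase (spaces ignored)
have = Counter(phrase.replace(" ", ""))
# Tally letters needed for the target word
need = Counter(target)

# Counter subtraction keeps only positive counts, i.e. the
# letters "orion" needs that the phrase cannot supply
missing = need - have
print(dict(missing))  # → {'o': 1, 'i': 1, 'n': 1}
```

The phrase has no "i", no "n", and only one "o", so "orion" cannot be formed from it; the model's "rearranging" claim is pure confabulation.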

AI is a complete joke, and I have no idea how anyone can think otherwise.

[–] [email protected] 0 points 1 month ago (22 children)

Every model they've released after 4 has been seemingly worse than the one before it.

[–] [email protected] 0 points 1 month ago (3 children)

So how many ChatGPT 4s have they precariously stacked up on top of each other this time?
