Why you gotta invent new hardware just to speed up AI, augh, these companies
Argh, after 25 years in tech I am surprised this keeps surprising you.
We’ve crested for sure. AI isn’t going to solve everything. AI stock will fall. Investor pressure to put AI into everything will subside.
Then we will start looking at AI as a cost-benefit analysis. We will start applying it where it makes sense. Things will get optimised. Real profit and long-term change will happen over 5-10 years. And afterwards, the utterly magical will seem mundane while everyone is chasing the next hype cycle.
I'm far, far more concerned about all the people who were deemed non-essential so quickly, after being "essential" for so long, because "AI will do so much work" *slaps employees with 2 weeks' severance*
I’m right there with you. One of my daughters loves drawing and designing clothes, and I don’t know what to tell her in terms of the future. Will human designs be more valued? Less valued?
I’m trying to remain positive; when I went into software my parents barely understood that anyone could make a living off that “toy computer”.
But I agree; this one feels different. I’m hoping they all feel different to the older folks (me).
Truth. I would say the actual time scales will be longer, but this is the harsh, soul-crushing reality that will make all the kids and mentally disturbed cultists on r/singularity scream in pain and throw stones at you. They're literally planning for what they're going to do once ASI changes the world to a star-trek, post-scarcity civilization... in five years. I wish I was kidding.
So should we be fearing a new crash?
Do you have money and/or personal emotional validation tied up in the promise that AI will develop into a world-changing technology by 2027? With AGI in everyone's pocket giving them financial advice, advising them on their lives, and romancing them like a best friend with Scarlett Johansson's voice whispering reassurances in your ear all day?
If you are banking on any of these things, then yeah, you should probably be afraid.
Have any regular users actually looked at the prices of the "AI services" and what they actually cost?
I'm a writer. I've looked at a few of the AI services aimed at writers. These companies literally think they can get away with "Just Another Streaming Service" pricing, in an era where people are getting really really sceptical about subscribing to yet another streaming service and cancelling the ones they don't care about that much. As a broke ass writer, I was glad that, with NaNoWriMo discount, I could buy Scrivener for €20 instead of regular price of €40. [note: regular price of Scrivener is apparently €70 now, and this is pretty aggravating.] So why are NaNoWriMo pushing ProWritingAid, a service that runs €10-€12 per month? This is definitely out of the reach of broke ass writers.
Someone should tell the AI companies that regular people don't want to subscribe to random subscription services any more.
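The subscription math the writer above is gesturing at is easy to make concrete. A quick sketch, using only the prices quoted in that comment (the multi-year comparison window is my own assumption):

```python
# One-time purchase vs. subscription, using the prices quoted above.
scrivener_regular = 40  # EUR, one-time purchase (regular price at the time)
pwa_monthly = 12        # EUR/month, ProWritingAid (upper end of the quote)

# Cost of the subscription over a typical multi-year writing project,
# compared against buying Scrivener once.
for years in (1, 2, 3):
    total = pwa_monthly * 12 * years
    print(f"{years} yr of ProWritingAid: EUR {total} "
          f"(vs EUR {scrivener_regular} once for Scrivener)")
```

One year of the subscription already costs more than three copies of the one-time tool, which is the gap broke writers are reacting to.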
As someone dabbling with writing, I bit the bullet and started looking into the tools to see if they're actually useful, and I was impressed with the promised features: grammar help, sentence structure, and making sure I don't leave loose ends in the story. These are genuinely useful tools if you're not using the generative capability to let it write mediocre bullshit for you.
But I noticed right away that I couldn't justify a subscription of $20-$30 a month, on top of the thousand other services we have to pay monthly for, including even the writing software itself.
I have lived fine and written great things in the past without AI, I can survive just fine without it now. If these companies want to actually sell a product that people want, they need to scale back the expectations, the costs and the bloated, useless bullshit attached to it all.
At some point soon, the costs of running these massive LLMs will exceed what the number of people actually willing to pay a premium for them can cover, and we will see the companies that host the LLMs start to scale everything back as they try to find some new product to hype and generate investment on.
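The "costs versus paying users" argument above can be sketched as a back-of-envelope break-even check. Every number below is a hypothetical placeholder chosen for illustration, not a real figure from any provider:

```python
# Hypothetical break-even: does subscription revenue cover inference cost?
# All numbers are made-up illustrations, not any real provider's data.
subscribers = 1_000_000
price_per_month = 20.0             # USD, a typical "pro" tier price
cost_per_1k_tokens = 0.01          # USD, assumed inference cost
tokens_per_user_month = 3_000_000  # assumed heavy per-user usage

revenue = subscribers * price_per_month
cost = subscribers * (tokens_per_user_month / 1000) * cost_per_1k_tokens

print(f"monthly revenue:        ${revenue:,.0f}")
print(f"monthly inference cost: ${cost:,.0f}")
print("profitable" if revenue > cost else "underwater")
```

With these (invented) numbers the service loses money on every heavy user, which is the dynamic the comment predicts will force a scale-back: either prices go up, usage gets capped, or the service shrinks.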
I work for an AI company that's dying out. We're trying to charge companies $30k a year and upwards for basically chatgpt plus a few shoddily built integrations. You can build the same things we're doing with Zapier, at around $35 a month. The management are baffled as to why we're not closing any of our deals, and it's SO obvious to me - we're too fucking expensive and there's nothing unique with our service.
That's a good point about the "AI as a service" model that is emerging.
I was reading that NaNoWriMo has had a significant turnover on their board due to the backlash against their pro-AI stance: https://www.cbc.ca/news/entertainment/nanowrimo-ai-controversty-1.7314090
What do people mean by "AI bubble"?
As in as soon as companies realise they won't be able to lay off everybody except executives and personal masseuses, nVidia will go back to having a normal stock price.
Rich people will become slightly less grotesquely wealthy, and everything must be done to prevent this.
The term "AI bubble" refers to the idea that the excitement, investment, and hype surrounding artificial intelligence (AI) may be growing at an unsustainable rate, much like historical financial or technological bubbles (e.g., the dot-com bubble of the late 1990s). Here are some key aspects of this concept:
- Overvaluation and speculation: Investors and companies are pouring significant amounts of money into AI technologies, sometimes without fully understanding the technology or its realistic potential. This could lead to overvaluation of AI companies and startups.
- Hype vs. reality: There is often a mismatch between what people believe AI can achieve in the short term and what it is currently capable of. Some claims about AI may be exaggerated, leading to inflated expectations that cannot be met.
- Risk of a market crash: Like previous bubbles in history, if AI does not deliver on its overhyped promises, there could be a significant drop in AI investments, stock prices, and general interest. This could result in a burst of the "AI bubble," causing financial losses and slowing down real progress.
- Comparison to previous bubbles: The "AI bubble" is compared to the dot-com bubble or the housing bubble, where early optimism led to massive growth and investment, followed by a sudden collapse when reality didn't meet expectations.
Not everyone believes an AI bubble is forming, but the term is often used as a cautionary reference, urging people to balance enthusiasm with realistic expectations about the technology’s development and adoption.
Not everyone believes an AI bubble is forming
Well, the AI's not wrong. No one believes a bubble is forming, since it's already about to burst!
Thank you for the explanation
The fact that you used AI to write this is... perfection.
What?! Of course I didn't! You're imagining things!