VoterFrog

joined 1 year ago
[–] [email protected] 1 points 6 months ago

Or they build trebuchets

[–] [email protected] 7 points 7 months ago

Yes, what I'm saying is that lower costs for software, which AI will help with, will make software more competitive against human production labor. The standard assumption is that if software companies can reduce the cost of producing software, they'll start firing programmers. But the entire history of software engineering has shown that that's not true as long as the lower cost opens up new economic opportunities for software users, thus increasing demand.

That pattern stops only when there are no economic opportunities to be unlocked. The only way I think that happens is when automation has become so prevalent that further advancement has minimal impact. I don't think we're there yet. Labor costs are still huge and automation is still relatively primitive.

[–] [email protected] 13 points 7 months ago (2 children)

One thing that is somewhat unique about software engineering is that a large part of it is dedicated to making itself more efficient, and always has been. Programming languages, protocols, frameworks, services: all of it has made programmers thousands of times more efficient than the guys who used to punch holes into cards to program the computer.

Nothing has infinite demand, clearly, but the question is more whether or not we're anywhere near the peak, such that more efficiency will result in an overall decrease in employment. So far, the answer has been no. The industry has only grown as it's become more efficient.

I still think the answer is no. There's far more of our lives, and of the way people do business, that can be automated as the cost of doing so is reduced. I don't think we're close to any kind of maximum saturation of tech.

[–] [email protected] 3 points 7 months ago

Did Starfield only cost 4x as much to make as HiFi? Doubt it. I'd bet the marketing budget of Starfield alone dwarfed the lifetime cost of HiFi. I agree that "bombed" is maybe too harsh, but the problem the article is talking about is ROI. As the I continues to balloon, the R needs to keep up, and it's not.

[–] [email protected] 5 points 7 months ago

Yeah, there's definitely some overlap. Lots of dark UX is used for enshittification, but sometimes enshittification is just plain, boldly bad UX for the sake of making money, with a hint of "Yeah, it's bad. What are you going to do about it?"

On the other hand, enshittification is part of a cycle that starts with a service that grows dominant at least in part by providing a great experience, only to tear that experience down when it gets in the way of making money. Dark UX isn't always part of that cycle. Plenty of services of all sizes use these patterns right from the start. Not really accurate to call it "enshittification" when it was always just shit.

[–] [email protected] 4 points 7 months ago

You don't see the difference between distributing someone else's content against their will and using their content for statistical analysis? There's a pretty clear difference between the two, especially as far as fair use is concerned.

[–] [email protected] 4 points 7 months ago (1 children)

I think that undersells most of the compelling open source libraries, though. The one-line or one-function open source libraries could be starved, I guess. But entire frameworks are open source. We're not at the point yet where AI can develop software on that scale.

[–] [email protected] 12 points 7 months ago (4 children)

Why do you think AI will starve open source?