this post was submitted on 28 Oct 2024

TechTakes

1430 readers

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 1 year ago

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

Last week's thread

(Semi-obligatory thanks to @dgerard for starting this)

(page 3) 50 comments
[–] [email protected] 0 points 3 weeks ago (14 children)

eigenrobot:

almost every smart person I talk to in tech is in favor of mandatory eugenic polygynous marriages in order to deal with the fertility crisis. people are absolutely fed up with the lefty approach of using generational insolvency as a pretextual cudgel to install socialism.

[–] [email protected] 0 points 3 weeks ago

Every person I talk to — well, every smart person I talk to — no, wait, every smart person in tech — okay, almost every smart person I talk to in tech is a eugenicist. Ha, see, everybody agrees with me! Well, almost everybody…

[–] [email protected] 0 points 3 weeks ago (3 children)

Man, I didn't even know how to react to this nonsense. The obvious sneer is to point out that if the alternative is to interact with people like ER here we really shouldn't be surprised to see a declining birth rate. But I think the more important takeaway that this hints at is that these people are dumb and fundamentally incurious.

Like, there's plenty of surveys and research into why people are having fewer kids than they used to, and it's not because toddlers are little hellions more so than in the past. And "generational insolvency" is a pretty big fucking part of the explanation actually, as is empowering families to choose whether or not to have children rather than leaving it entirely up to the vicissitudes of biological processes and horniness. The latter part cuts both ways, in that people who want families are (theoretically; see above re: financial factors) able to take advantage of fertility treatments or IVF or whatever and have kids where they historically would have been unable to do so.

But no, rather than actually engage with any of that or otherwise treat the world like other people have agency they have identified what they believe to be the problem and have decided that the brute application of state power is the solution, so long as that power is being applied to other people. For all that we acknowledge the horrors of fascism, I think the stupidity of these people is also worth acknowledging, if for no other reason than to reinforce why this shit shouldn't be taken seriously.

load more comments (3 replies)
[–] [email protected] 0 points 3 weeks ago

aww, is the poor baby missing that maybe there's people who don't want to talk to them because of how much of a piece of shit they are? how sad

lefty approach of using generational insolvency as a pretextual cudgel to install socialism

this dipshit continues to make the most astounding not-even-wrong posts. guess they're angling for a job as the next Noahpinion or Yglesias

load more comments (11 replies)
[–] [email protected] 0 points 3 weeks ago (1 children)

Got linked to this UFO sightings timeline in Popbitch today. Thought it looked quite interesting and quite fun. Then I realized the information about individual UFO sightings was being supplied by bloody Copilot, and therefore was probably even less accurate than the average UFOlogy treatise.

PS: Does anyone know anything about using ArcGIS to make maps? I have an assignment due tomorrow and I'm bricking it.

[–] [email protected] 0 points 3 weeks ago (2 children)

I know a bit about qgis + wfs (and surroundings)? enough to be dangerous tho

load more comments (2 replies)
[–] [email protected] 0 points 3 weeks ago (1 children)

https://www.infoworld.com/article/3595687/googles-flutter-framework-has-been-forked.html/

I’m currently using Flutter. It’s good! And useful! Much better than AI. It being mostly developed by Google has been a bit of a worry since Google is known to shoot itself in the foot by killing off its own products.

So while it’s no big deal to have an open source codebase forked, just wanted to highlight this part of the article:

Carroll also claimed that Google’s focus on AI caused the Flutter team to deprioritize desktop platforms, and he stressed the difficulty of working with the current Flutter team

Described as “Flutter+” by Carroll, Flock “will remain constantly up to date with Flutter,” he said. “Flock will add important bug fixes, and popular community features, which the Flutter team either can’t, or won’t implement.”

I hope this goes well!

[–] [email protected] 0 points 3 weeks ago (1 children)

that android project of some months back was a venture into flutter (and I hadn’t touched it before)

I had similar impressions on some things, and mixed on other

dart’s a moderately good language with some nice primitives, tooling overall is pretty mature, broad strokes it works well for variant targeting and shit

libraries though holy shit the current situation (then) was a mess. one minor flutter sdk upgrade and a whole bunch of things just exploded (serialisation things in nosql-type libraries I tried to use for the ostensible desired magic factor (just went back to sqlite stuff again after)). this can’t have been due to sdk drift alone, and felt like an iceberg problem

and then the documentation: fucking awful, for starting. excellent as technical documentation once you grok shit but before that all the examples and things are terrible. lots of extremely important details hidden in single mentions in offhand sentences in places that if you don’t happen to be looking at that exact page good luck finding it. this, too, felt like inadequate care and attention by el goog

I imagine if one is working with this every day you know the lay of the land and where to avoid stepping into holes, but wow was I surprised at how much it was possible to rapidly rakestep, given what the language pitches as

[–] [email protected] 0 points 3 weeks ago (1 children)

Yes to be clear when I say flutter is “good” I deliberately avoided a definition of “good”. I find it… usable.

[–] [email protected] 0 points 3 weeks ago

yep yep - didn’t mean to argue with your post inasmuch as to fill in details to the fork, but I guess I could’ve been clearer about that

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

I know it's Halloween, but this popped up in my feed and was too spooky even for me 😱

As a side note, what are people’s feelings about Wolfram? Smart dude for sho, but some of the shit he says just comes across as straight-up pseudoscientific gobbledygook. But can he out-guru Big Yud in a 1v1 on Final Destination (fox only, no items)? 🤔

[–] [email protected] 0 points 3 weeks ago (4 children)
[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

on a side note, I notice this passage in the review:

Wolfram refers incessantly to his "discovery" that simple rules can produce complex results. Now, the word "discovery" here is legitimate, but only in a special sense. When I took pre-calculus in high school, I came up with a method for solving systems of linear equations, independent of my textbook and my teacher: I discovered it. My teacher, more patient than I would be with adolescent arrogance, gently informed me that it was a standard technique, in any book on linear algebra, called "reduction to Jordan normal form", after the man who discovered it in the 1800s. Wolfram discovered simple rules producing complexity in just the same way that I discovered Jordan normal form.

this is certainly mistaken. I think the author or teacher must have meant RREF or something to that effect, not Jordan normal form
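for the curious, the standard technique being described (reduction to RREF, i.e. Gauss–Jordan elimination) can be sketched in a few lines — a naive Python illustration, not numerically robust; real code should use partial pivoting or a proper linear algebra library:

```python
def rref(m):
    """Reduce a matrix (list of row lists) to reduced row echelon form in place."""
    rows, cols = len(m), len(m[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Find a row with a nonzero entry in this column to serve as the pivot.
        pivot = next((r for r in range(pivot_row, rows) if m[r][col] != 0), None)
        if pivot is None:
            continue  # no pivot in this column, move on
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        # Scale the pivot row so the pivot entry becomes 1.
        scale = m[pivot_row][col]
        m[pivot_row] = [x / scale for x in m[pivot_row]]
        # Eliminate this column's entry from every other row.
        for r in range(rows):
            if r != pivot_row:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

# Solve x + y = 3, x - y = 1 via the augmented matrix [A | b]:
solution = rref([[1.0, 1.0, 3.0], [1.0, -1.0, 1.0]])
# the last column now reads off the solution x = 2, y = 1
```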

[–] [email protected] 0 points 3 weeks ago (1 children)

Gauss–Jordan elimination, maybe?

load more comments (1 replies)
load more comments (1 replies)
load more comments (3 replies)
[–] [email protected] 0 points 3 weeks ago

The big difference is that Yud is unrigorous while Wolfram is a plagiarist. Or maybe putting it another way, Yud can't write proofs and Wolfram can't write bibliographies.

[–] [email protected] 0 points 3 weeks ago (1 children)

I mean yud is only really a guru on his own farts

[–] [email protected] 0 points 3 weeks ago

classic warning case of someone getting high on their own supply

[–] [email protected] 0 points 3 weeks ago (1 children)

Microsoft found a fitting way to punish AI for collaborating with SEO spammers in generating slop: make it use the GitHub code review tools. https://github.blog/changelog/2024-10-29-refine-and-validate-code-review-suggestions-with-copilot-workspace-public-preview/

[–] [email protected] 0 points 3 weeks ago (2 children)

we really shouldn’t have let Microsoft both fork an editor and buy GitHub, of course they were gonna turn one into a really shitty version of the other

anyway check this extremely valuable suggestion from Copilot in one of their screenshots:

The error message 'userId and score are required' is unclear. It should be more specific, such as 'Missing userId or score in the request body'.

aren’t you salivating for a Copilot subscription? it turns a lazy error message into… no that’s still lazy as shit actually, who is this for?

  • a human reading this still needs to consult external documentation to know what userId and score are
  • a machine can’t read this
  • if you’re going for consistent error messages or you’re looking to match the docs (extremely likely in a project that’s in production), arbitrarily changing that error so it doesn’t match anything else in the project probably isn’t a great idea, and we know LLMs don’t do consistency
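to make those bullet points concrete, here's a hypothetical sketch (field names and error shape invented for illustration) of what a genuinely useful validation error looks like: derived from one source of truth, machine-readable, and consistent by construction — none of which an LLM rewording one string in one place gets you:

```python
# Hypothetical validation helper: derive the error from the schema itself,
# so the human message and the machine-readable payload can never drift apart.
REQUIRED_FIELDS = {"userId", "score"}  # single source of truth

def validate(body: dict):
    """Return None if the request body is valid, else a structured error."""
    missing = sorted(REQUIRED_FIELDS - body.keys())
    if not missing:
        return None
    return {
        "error": "missing_required_fields",  # stable code a machine can match on
        "missing": missing,                  # machines can read this directly
        "message": f"Missing required field(s): {', '.join(missing)}",
    }

err = validate({"score": 10})
# err["missing"] == ["userId"]
```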
[–] [email protected] 0 points 3 weeks ago (4 children)

I want someone to fork the Linux kernel and then unleash like 10 Copilots to make PRs and review each other. No human intervention. Then plot the number of critical security vulnerabilities introduced over time, assuming they can even keep it compilable for long enough.

[–] [email protected] 0 points 3 weeks ago (1 children)

that’d be an interesting experiment but also that’s $2400 you could spend on more useful things, like bootstrapping your whiskey collection

load more comments (1 replies)
load more comments (3 replies)
[–] [email protected] 0 points 3 weeks ago

@self did somebody make an extension that replaces github copilot with ELIZA yet

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago)

Cursed .gov link:

https://www.state.gov/secretary-antony-j-blinken-at-the-advancing-sustainable-development-through-safe-secure-and-trustworthy-ai-event/

TL;DR: Our main characters have bilked a very credulous US State Department. 100 Million tax dollars will now be converted into entropy. There will also be committees.

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Go home Coursera, you're drunk.

Want to get even better results with GenAI? The new Google Prompting Essentials course will teach you 5 easy steps to write effective prompts for consistent, useful results.

Note: I had to highlight the text because the email's text was white-on-white.

[–] [email protected] 0 points 3 weeks ago

Thanks, Google. You know, I used to be pretty good at getting consistent, useful results from your search engine, but the improvements you've made to it since then make me feel like I really might need a fucking prompt engineering course to find things on the internet these days. By which I mean something that'll help you promptly engineer the internet back into a form where search engines work correctly.

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Is there a group that more consistently makes category errors than computer scientists? Can we mandate Philosophy 101 as a pre-req to shitting out research papers?

Edit: maybe I need to take a break from Mystery AI Hype Theater 3000.

[–] [email protected] 0 points 3 weeks ago

Chat-GPT-TFSD-21guns can have a little anthropomorphism, as a treat.

  • Sam Altman, probably
[–] [email protected] 0 points 3 weeks ago (1 children)

Jingna Zhang found an AI corp saying the quiet part out loud:

In a previous post of mine, I noted how the public generally feels that the jobs people want to do (mainly creative jobs) are the ones being chiefly threatened by AI, with the dangerous, boring and generally garbage jobs being left relatively untouched.

Looking at this, I suspect the public views anyone working on/boosting AI as someone who knows full well their actions are threatening people's livelihoods/dream jobs, and is actively, willingly and intentionally threatening them, either out of jealousy for those who took the time to develop the skills, or out of simple capitalist greed.

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

I thought the Raytheon ads for tanks and knife missiles in the Huntsville, AL airport were bad, but this takes the whole goddamn cake, and two scoops of ice cream with it.

[–] [email protected] 0 points 3 weeks ago

Ah, Huntsville. Where the downtown convention hall is the Wernher von Braun Center.

🎶 the man whose allegiance is ruled by expedience 🎶

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (9 children)

You don't get it, this is a likely bribe

[–] [email protected] 0 points 3 weeks ago (1 children)

yikes, good call! I couldn't get past the Borderlands 2 vibes, but you're right.

[–] [email protected] 0 points 3 weeks ago

it's like generalized manufacturing consent

load more comments (8 replies)
[–] [email protected] 0 points 3 weeks ago (2 children)

Raytheon can at least claim they're helping kill terrorists or some shit like that, Artisan's just going out and saying "We ruin good people's lives for money, and we can help you do that too"

[–] [email protected] 0 points 3 weeks ago

Grift tech that claims to do awful shit that ruins everyone's lives, but really just makes Stanford grads sit around pretending to invent something while funneling VC money directly into their bloodstreams.

You'd think these would overflow the evil scale and wrap back around into being ethical, but really they're just doing the same thing as the non-vaporware evil companies, with just some extra steps.

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago)

Right? At least the knife missile does what it says on the tin.

Apologies in advance for the Rick and Morty reference, but Artisan seems to be roughly congruent to "Simple Rick" candy bars.

The (poorly executed) distillation of the life's work of actually talented and interesting people, sold as a direct replacement, to fill a void that the customer doesn't even know exists.

[–] [email protected] 0 points 3 weeks ago (1 children)

Zuck says lots more slop coming your way soon

“I think we’re going to add a whole new category of content which is AI generated or AI summarized content, or existing content pulled together by AI in some way,” the Meta CEO said. “And I think that that’s gonna be very exciting for Facebook and Instagram and maybe Threads, or other kinds of feed experiences over time.”

Facebook is already one Meta platform where AI generated content, sometimes referred to as “AI slop,” is increasingly common.

[–] [email protected] 0 points 3 weeks ago

mm I wonder what kind of content they'll want to use that for
