this post was submitted on 23 Mar 2024
650 points (98.5% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.


founded 1 year ago
top 10 comments
[–] [email protected] 6 points 8 months ago

Me looking at my unit tests failing one after another.

[–] [email protected] 23 points 8 months ago* (last edited 8 months ago) (1 children)

Yeah, but have you ever coded shaders? That shit's magic sometimes. It's also a pain to debug: you have to look at colors, or sometimes millions of numbers through a frame analyzer, to see what you did wrong. You can't print messages to a log.
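The "look at colors" debugging described above can be sketched as a fragment shader that writes the intermediate value you want to inspect into a color channel — the shader equivalent of a printf. This is a minimal illustrative sketch (the `uv` input and the `sin`-based "suspect value" are made up for the example, not from the comment):

```glsl
#version 330 core
in vec2 uv;          // hypothetical interpolated texture coordinate
out vec4 fragColor;

void main() {
    // Suppose this is the intermediate value you suspect is wrong.
    float value = sin(uv.x * 40.0) * uv.y;

    // No logging available: remap the value into [0, 1] and write it to
    // the red channel, then eyeball the rendered frame (or step through
    // it in a frame analyzer) to see where it misbehaves.
    fragColor = vec4(clamp(value * 0.5 + 0.5, 0.0, 1.0), 0.0, 0.0, 1.0);
}
```

Tools like RenderDoc automate the "millions of numbers" part by letting you inspect per-pixel values from a captured frame.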

[–] [email protected] 2 points 8 months ago

Can't you run it on an emulator for debugging, Valgrind-style?

[–] [email protected] 91 points 8 months ago (1 children)

Computers don't do what you want, they do what you tell them to do.

[–] [email protected] 48 points 8 months ago (1 children)

Exactly, they're passive aggressive af

[–] [email protected] 20 points 8 months ago (1 children)

I wouldn't call them passive, they do too much work. More like aggressively submissive.

[–] [email protected] 19 points 8 months ago* (last edited 8 months ago)

Maliciously compliant perhaps

They do what you tell them, but only exactly what and how you tell them. If you leave any uncertainty, chances are it'll fuck up the task.

[–] [email protected] 51 points 8 months ago (1 children)

Stupid code! Oh, looks like this was my fault again... this time.

[–] [email protected] 14 points 8 months ago (1 children)

Must've been ChatGPT's fault

[–] [email protected] 8 points 8 months ago

My experience: if you don't know exactly what code the AI should output, it's just Stack Overflow with extra steps.

Currently I'm using a 7B model, so that could be why?