this post was submitted on 18 Mar 2025
1081 points (97.4% liked)
People Twitter
you are viewing a single comment's thread
No, it was really a technical issue. Analog signals are very prone to noise, and that noise is cumulative across copies. Even the best recording heads pick up stray magnetic fields, you get the typical cosmic ray noise hitting the tape and head, and interference from power lines contributes still more.
Basically, it's caused by something you don't get to hear anymore: tune an older radio to somewhere between stations, and the static is there all the time. If the environment were noise-free, you'd hear silence instead of static. Same with older, analog TVs: you see snow and hear static. That's all environmental noise, and it gets into any analog recording medium, on the source side as well as the playback side. That's what signal-to-noise ratio means: how much signal there is versus how much noise.
So, dupe of a dupe of a dupe... All recording noise.
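The "dupe of a dupe" effect can be sketched in a few lines. This is a toy simulation, not a model of real tape physics: each copy generation just adds fresh, independent Gaussian noise (the noise level 0.02 is an arbitrary assumption), and the signal-to-noise ratio in decibels drops every generation.

```python
import math
import random

def snr_db(signal, noisy):
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    p_sig = sum(s * s for s in signal) / len(signal)
    p_noise = sum((n - s) ** 2 for s, n in zip(signal, noisy)) / len(signal)
    return 10 * math.log10(p_sig / p_noise)

random.seed(0)
# A clean "master": one second of a 5 Hz sine sampled at 1 kHz.
master = [math.sin(2 * math.pi * 5 * t / 1000) for t in range(1000)]

snrs = []
copy = master[:]
for generation in range(1, 6):
    # Each analog dub adds fresh noise on top of whatever noise
    # the previous generation already carried.
    copy = [x + random.gauss(0, 0.02) for x in copy]
    snrs.append(snr_db(master, copy))
    print(f"generation {generation}: SNR = {snrs[-1]:.1f} dB")
```

Because the noise powers of independent sources add, n generations give roughly n times the noise power of one generation, so the SNR keeps falling with every dub.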
It's also the quality of the tape and how we used it. You could buy a blank tape and record 2, 4, or 6 hours on it (SP, LP, or EP speed), with a tradeoff in quality at the slower speeds. Blank tapes were rather expensive, so most of us recorded at the 6-hour speed and then copied from there.
Commercially produced VHS tapes also tended to be higher quality than blank tapes, unless you went out of your way to buy the quality ones.
And it still happens even with digital technology. If you, say, rotate and re-save a .jpg file a few thousand times, the image will start to degrade: each save doesn't perfectly preserve everything, and the very slight losses add up.
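The digital version of generation loss can be sketched too. This is not real JPEG (no DCT, no actual image file), just the same principle in one dimension: each "save" resamples slightly (a small blur standing in for the interpolation a rotation does) and then quantizes (standing in for JPEG's coarse coefficient quantization). Neither step is exactly invertible, so the error versus the original grows with each generation. The quantization step of 8 and the toy "image" are arbitrary assumptions.

```python
def reencode(pixels, step=8):
    """One lossy 'save': blur slightly, then quantize to multiples of step."""
    n = len(pixels)
    blurred = [
        (pixels[max(i - 1, 0)] + 2 * pixels[i] + pixels[min(i + 1, n - 1)]) / 4
        for i in range(n)
    ]
    return [round(p / step) * step for p in blurred]

def rms_error(a, b):
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

original = [(i * 37) % 256 for i in range(64)]  # arbitrary "image" row
generation = original
errors = []
for _ in range(20):
    generation = reencode(generation)
    errors.append(rms_error(original, generation))

print(f"after 1 save:   RMS error = {errors[0]:.1f}")
print(f"after 20 saves: RMS error = {errors[-1]:.1f}")
```

One caveat worth knowing: 90-degree rotations of a JPEG can be done losslessly by special-purpose tools that shuffle the compressed blocks instead of decoding and re-encoding, so the degradation depends on the software doing the rotating.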
That's a somewhat different issue. JPEG compression is lossy; it doesn't happen with a BMP, because that format stores the pixels losslessly. Though you can probably link the two with the same underlying information theory.
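That distinction is easy to check: a lossless round-trip is bit-exact no matter how many generations you go through. A small sanity check, using zlib compression (the same family of lossless coding PNG and zip use) as a stand-in for a lossless re-encode:

```python
import hashlib
import zlib

data = bytes(range(256)) * 100  # stand-in for an uncompressed bitmap
digest = hashlib.sha256(data).hexdigest()

generation = data
for _ in range(1000):
    # A lossless "re-encode": compress, then decompress.
    generation = zlib.decompress(zlib.compress(generation))

assert hashlib.sha256(generation).hexdigest() == digest
print("1000 lossless generations, still bit-identical")
```

That's the key difference from both analog tape and lossy formats: lossless digital copies don't accumulate error at all.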