this post was submitted on 14 Aug 2024
-1 points (0.0% liked)
Linux Gaming
you are viewing a single comment's thread
view the rest of the comments
Then what's with all these devastated comments on social media?
You're trolling at this point, right? You have a Boxxy profile picture and yet you're confused about the dynamics of social media?
There's a difference between "can't code" and "can't work".
A lot of people use git for version control: super good idea, basically anything else is at best unorthodox, at worst bizarrely stupid.
A lot of people also use GitHub for repository hosting, continuous integration, code review, deployment, packaging, and so on. This is more of an opinion thing than a standard practice thing, and there are plenty of other ways to get the same tools, either all in one package or from a variety of different ones, self-hosted, in the cloud, or some hybrid in between.
If GitHub goes down, you can make code changes and everything else to your heart's content. But you might not be able to run your full integration testing pipeline, get a code review, or package your software.
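To make that concrete: git itself keeps working entirely locally during an outage. A minimal sketch (the `backup` remote and its URL are made up, not anyone's actual setup):

```
# committing is purely local and doesn't care whether github.com is up
git add -A
git commit -m "keep working while GitHub is down"

# only talking to the GitHub remote fails during the outage
git push origin main          # this is the part that errors out

# if you also keep a remote on a self-hosted server, collaboration still works
git remote add backup git@git.example.com:me/project.git
git push backup main
```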
If your local build process pulls packages from GitHub or refreshes a remote repository automatically, an outage can also thoroughly mess that up, but that has nothing to do with git. You can use "ctrl-c/v" backups and still have a build process that tips over when GitHub goes down.
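As a sketch of that kind of coupling (the package name and URL below are hypothetical), the dependency resolves against github.com at build time, so the build falls over during an outage no matter how you do version control locally:

```
# installing a dependency pinned straight to a GitHub URL
pip install "somelib @ git+https://github.com/example/somelib.git@v1.2.3"

# or a build script that refreshes a vendored checkout on every run
git -C vendor/somelib pull --ff-only
```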
Under which circumstances would using any VCS that isn’t Git be “bizarrely stupid” and why? I mean, everyone has strong opinions about something, but I’m curious now.
File1, file2, file_3.new, etc. would be bizarrely stupid. A home-rolled solution involving rsync, tar, gzip, cron jobs, or inotify would also be bizarrely stupid.
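To be clear about what I mean by home-rolled (paths and schedule made up), something like this gives you timestamped snapshots but no diffs, no queryable history, no merging, and no blame:

```
# crontab entry: hourly tarball "versioning" of the project directory
0 * * * * tar -czf "$HOME/backups/project-$(date +\%Y\%m\%d-\%H\%M).tar.gz" "$HOME/project"
```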
https://en.wikipedia.org/wiki/List_of_version-control_software As a more serious answer: anything on that list that's marked as anything other than "active", so things like DCVS, Visual SourceSafe, or BitKeeper. Anything that's not getting bug fixes or maintenance.
Anything without enough usage to give you confidence that bugs or glitches are being caught in the wild would be risky, since you don't want to be the person who finds that edge case.
There are things other than git that aren't wrong, but I see little compelling reason not to use the most ubiquitous tool.
Ubiquity is not always the most relevant criterion. (Especially as most VCSes which aren't Git :-) are easy enough to understand - most of them are even easier than Git, in my opinion.)
Of course it's not the only factor, but all things being equal, the most prevalent tool should be preferred.
It depends on the wider circumstances, I think. Using the prevalent tool makes sense in existing environments (which is one of the reasons why many companies use SVN - it worked for them before Git existed and it still works for them, so why not?). For new projects, one-man teams and/or companies starting from scratch, Git might not always be the best choice.
People like to make a fuss and get attention on social media; it's something to talk about, entertainment.
They weren't even down for that long.