this post was submitted on 25 Jan 2024

Gaming

[–] [email protected] 0 points 9 months ago* (last edited 9 months ago)

I hate this conflation of "Developer" with every other role in modern game development.

If you think the new Porsche looks shit, do you blame the mechanical engineer who designed the brake mechanism?

If your new manga body pillow gives you a rash, do you blame the graphic designer of the manga?

There is not a single thing listed in the meme above that is actually the fault of the actual developers working on the game. Don't even need to talk about the first picture.

Game size is studio-management related. They want to stuff as much (repetitive, boring) content into the game as possible, plus a multiplayer mode no one asked for.

Optimizations don't happen because the CEO decides to take the game's sales money this quarter, not next, and ships an unfinished product.

Always-online is ALWAYS a management decision.

It's a shit joke: it blames the wrong people, and it's also just dumb.

[–] [email protected] 0 points 9 months ago

Man, I miss the good old days.

[–] [email protected] 0 points 9 months ago

I see stuff like this and I don't blame developers/coders for all the shit that's happening. If you look objectively at the gameplay and such, most games are actually pretty decent on their own. The graphics are usually really nice, the story is adequate if not genuinely good, the controls are sensible and responsive...

A lot of the major complaints about modern games aren't necessarily about what the devs are making; they're about what the garbage company demands as part of the whole thing. Online-only single-player is entirely about control: keeping you from pirating the game (or at least trying to), plus spying on you and serving you ads and such... Bad releases happen because stuff gets pushed out the door before it's ready, because the company needs more numbers for its profit reports, so things that haven't been given enough time and need more work get pushed onto paying customers. Day-one patches are normal because between the time they seed the game to distributors like Valve and Microsoft and the time the game unlocks on launch day, stuff is still being actively worked on and fixed.

The large game studios have turned the whole thing into a meat grinder to pump as much money out of their customers as possible, as often as possible, and they've basically ruined a lot of the simple expectations for game releases: having a game that works, performs adequately, doesn't crash, and doesn't need huge extras (like updates) to work on day 1...

Developers themselves aren't the problem. Studios are the problem and they keep consolidating into a horrible mass of consumer hostile policies.

[–] [email protected] 0 points 9 months ago

games made with agile teams and with passion are probably good, regardless of when they were made. i'm young, but growing up i only had access to really old computers, and i saw that most of the stuff made back in the day was just garbage shovelware. it was hard not to get buried in it.

most triple-A developers today are far more skilled at both writing and optimizing code. however, when management forces you to work long hours you're gonna make more mistakes, and with tight deadlines, if testing and bug fixing come after developing the entire game, they're the first things to get cut.

that being said, i wish they really did something about the massive size games take on disk. my screen is 1080p, my hardware can barely handle your game on low at 1080p, so everything is gonna get downscaled regardless, and despite how hard you wanna ignore it, data caps are still here. why am i forced to get all assets and textures in 4K/8K? make it optional, goddammit.

[–] [email protected] 0 points 9 months ago

This is so true. Also, let's not forget when the game is almost unplayable and constantly crashing on release.

[–] [email protected] 0 points 9 months ago

For those who are unaware, the second chad is most likely referring to .kkrieger. Not a full game, but a demo (from the demoscene) whose purpose was to make a fully playable game with a maximum size of 96 KB. Even going very slowly, you won't need more than 5 minutes to finish it.

The startup is very CPU-heavy and takes a while, even on modern systems, because it generates all the geometry, textures, lighting and whatnot procedurally from compact stored instructions.
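A toy illustration of the idea, in Python: a tiny seed is deterministically expanded into kilobytes of "texture" data at startup, instead of shipping the data itself. (This xorshift generator is nothing like .kkrieger's actual tooling, which is far more sophisticated; it only shows the size/CPU trade-off.)

```python
def generate_texture(seed: int, size: int = 64) -> bytes:
    """Deterministically expand a tiny seed into a grayscale 'texture'
    using an xorshift32 PRNG -- a stand-in for real procedural generation."""
    state = (seed & 0xFFFFFFFF) or 1  # xorshift state must be nonzero
    pixels = bytearray()
    for _ in range(size * size):
        state ^= (state << 13) & 0xFFFFFFFF
        state ^= state >> 17
        state ^= (state << 5) & 0xFFFFFFFF
        pixels.append(state & 0xFF)
    return bytes(pixels)

# 4 KB of texture data reconstructed on demand from a 4-byte seed:
tex = generate_texture(42)
print(len(tex))  # 4096
```

The download stays tiny because only the seed and the generator ship; the cost moves to the CPU at load time, which is exactly why .kkrieger's startup is slow.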

[–] [email protected] 0 points 9 months ago (1 children)

There used to be a time when game devs wrote their masterpieces using assembly. Now it's all crap Unreal Engine

[–] [email protected] 0 points 9 months ago (2 children)

What's wrong with Unreal Engine? 🤔

[–] [email protected] 0 points 9 months ago (1 children)

Some devs just enable raytracing and make it a requirement so they don't have to care about properly optimized alternative lighting and shadows.

[–] [email protected] 0 points 9 months ago (1 children)

Doesn't sound like a game engine problem

[–] [email protected] 0 points 9 months ago (1 children)

Same as using an AI in games is not an AI problem.

[–] [email protected] 0 points 9 months ago (1 children)

Correct. If you build a house with cheap labour and bad materials, it's the builder's fault. That doesn't make all houses bad and unreliable.

[–] [email protected] 0 points 9 months ago

I mean, if the ecosystem makes it very convenient to use such tools and call the task finished, that's not okay. I wish at some point we'd come to the conclusion that we need to optimize code and software products to reduce CO2 emissions or something, so devs' laziness finally becomes less tolerated.

[–] [email protected] 0 points 9 months ago (1 children)

Most devs either won't or can't bother with proper optimization. It's a problem at least as old as Unreal Engine 3: I remember Unreal Tournament 3 running butter-smooth on relatively weak computers, while other games made with UE3 would be choppy and laggy on the same rigs despite having less graphical clutter.

[–] [email protected] 0 points 9 months ago

That doesn't sound like an engine problem tho

[–] [email protected] 0 points 9 months ago (3 children)

Ok, that got me. I still remember the days of the ZX and that funny noise... But I do have a question about one part of the meme: can someone explain to me why on Earth updates now weigh tens of gigs? I can accept that hi-res textures and other assets take that space, but those are most likely not the bits being updated most of the time. Why don't devs just take the code they actually updated and send that our way?

[–] [email protected] 0 points 9 months ago (1 children)

For modern games, from what I've seen, they've taken a more modular approach to how assets are saved. You'll have large data files which are essentially full of compressed textures or similar. Depending on how many textures you're using and how many versions of each texture exist (for different detail levels), that can be a lot of assets, even if everything in one file is, say, wall textures.

So the problem is that the updaters/installers are not sophisticated enough to update a single texture inside a compressed dataset file. The solution is instead to replace the entire dataset with one that contains the new information. So while you're adding an item or changing how something looks, you're sending not only the item but also all similar items (everything in the same set) again, even though 90% of it didn't change. The files can easily reach into the tens of gigabytes due to how many assets are needed. Adding a map? The dataset file for all maps needs to be sent. Adding a weapon, or changing the look/feel/animation of a weapon? Here's the entire weapon dataset again.

Though not nearly as horrible, the same can be said for the libraries and executable binaries of the game logic. A variable was added? Well, here's that entire binary file with the change (not just the change). Binaries tend to be a lot smaller than the assets, so it's less problematic.

The entirety of the game content is likely stored in a handful (maybe a few dozen at most) of dataset files, so if any one of them changes for any reason, end users need to download 5-10% of the game's installed size to get the update.

Is there a better way? Probably, but it may be too complex to be worth it. Basically: write a small patching program to unpack the dataset, replace/insert the new assets, then repack it. That would reduce the download size but increase the amount of work the end user's system needs to do for the update, which may or may not be viable depending on the platform. PC games could support it, but what happens when you're shipping across PC, Xbox, PlayStation, and Nintendo Switch? Do those consoles give your game the read/write access it needs to do the unpacking and repacking? Do they have the space for that?

It becomes a risk; doing it the way they do now, if you have enough room to download the update, no more space is needed, since the update manager simply copies the updated dataset entirely over the old one.

It's a game of choices and variables, risks and rewards. Developers definitely don't want to get into the business of custom updaters per platform, so you have to find a solution that works for everyone who might be running the game. The current solution wastes bandwidth, but it has the merit of being cross-compatible and consistent: the process is the same on every platform.
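The core of the trade-off above can be sketched in a few lines of Python. The "format" here is entirely made up (not any real engine's): all assets are packed into one compressed blob, so changing a single asset changes the whole file, and the whole file has to be shipped again.

```python
import zlib


def pack(assets: dict[str, bytes]) -> bytes:
    """Concatenate all assets and compress them as one blob,
    like a monolithic dataset file."""
    blob = b"".join(
        name.encode() + b"\0" + data for name, data in sorted(assets.items())
    )
    return zlib.compress(blob)


# A hypothetical dataset: 100 wall textures packed into one file.
assets = {f"wall_{i:03d}.tex": bytes([i % 251]) * 4096 for i in range(100)}
original = pack(assets)

# Patch a single 4 KB texture...
assets["wall_007.tex"] = bytes([99]) * 4096
updated = pack(assets)

# ...and the entire dataset file differs, so the updater re-sends all of it.
print(updated != original)  # True
```

A smarter updater could diff `original` against `updated` and ship only the delta, but as described above, that requires unpack/repack work and scratch space on every target platform.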

[–] [email protected] 0 points 9 months ago (1 children)

The console argument actually makes a lot of sense to me, thank you for the detailed response. It would still (seemingly) be possible to structure the project so that you only replace what actually needs replacing, but that requires more investment in the architecture and would likely cause more errors due to the added complexity. Still, I cannot forgive the BG3 coders for making me redownload those 120 GB or so!

[–] [email protected] 0 points 9 months ago

The issue is the compression. There are hundreds of individual assets, and the process to compress, or more accurately uncompress, the assets for use takes processor resources. Usually it only needs to be done a few times, when the game starts and loads the assets required: when you hit a loading screen, the game is unpacking the relevant assets from those dataset files. Every time the game opens one of those datasets, it takes time to open the dataset file on the host system, unpack its index, and finally retrieve the assets needed.

Two things about this process: first, securing access to the file and reading the index is fairly slow. It takes significant time (relative to the other steps in the process) and accomplishes nothing except preparing to load the relevant assets; it's basically overhead. Second, compression is most effective at shrinking the total size when there's more (and more similar) data in the file.

Very basically, the simplest compression, zip (aka "compressed folders" in Windows), looks through the files for repeating sections of data and replaces the repeated content with a reference to the original data. The reference is much smaller than the data it replaces; this is sometimes called de-duplication. If you had a set of files that contained mostly the same data, say text files full of the same repeating messages, the resulting compression would be very high (much smaller output). This method works well for things like log files, since there are many repeating dates, times, and messages with only a few unique variances from line to line. This is an extremely basic description of one common style of compression; it's certainly not the only way, and not necessarily the (only) method being used here.
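That de-duplication effect is easy to demonstrate with Python's zlib (a generic DEFLATE implementation, used here purely as an illustration, not as whatever any particular game actually uses):

```python
import os
import zlib

# Repetitive data, like log files full of repeated timestamps and messages.
repetitive = b"2024-01-25 12:00:00 INFO heartbeat ok\n" * 1000

# Incompressible data, like random or already-compressed content.
incompressible = os.urandom(len(repetitive))

# Repeated content shrinks dramatically; random content barely shrinks at all.
print(len(repetitive), len(zlib.compress(repetitive)))
print(len(incompressible), len(zlib.compress(incompressible)))
```

The repetitive input compresses to a small fraction of its size, while the random input stays roughly as large as it started, which is why packing many similar assets into one dataset file compresses so much better than many small, diverse files.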

If there's less content per compressed dataset file, there are fewer opportunities for the compression to shrink the content, so large, similar datasets are preferable to smaller ones containing more diverse data.

This, combined with the relatively long open time per file, means that programmers want as few datasets as possible, both to keep the system from opening many files during load times and to boost the efficiency of those compressed files to optimal levels.

If, for example, many smaller files were used, then yes, updates would be smaller. However, loading times could end up doubled or tripled. Given that you load data many times over (every time you load into a game or a map), compared to how infrequently you perform updates, the right choice is to have updates take longer and require more download, so your in-game loads are much faster.

With the integration of solid-state storage in most modern systems, loading times have also been dramatically reduced thanks to the sheer speed at which files can be locked, opened, and streamed into working memory, but it's still a trade-off that needs to be taken into account. This is especially true for PC releases, since PCs can have wildly different hardware and not everyone is using an SSD or similar fast flash storage, whether on older systems or because the user simply prefers the cheaper space of spinning-platter hard disks.

All of this must be counterbalanced to provide the best possible experience for the end user, and I assure you that all aspects of this process are heavily scrutinized by the people who design the game. Often, these decisions are made early on so that the rest of the loading system can be designed around them consistently and doesn't need to be reworked partway through the lifecycle of the game. It's very likely that even as systems and standards change, the loading system in the game will not, so if the game was designed with optimizations for hard disks (not SSDs) in mind, that won't change until at least the next major release in the game's franchise.

What isn't really excusable is when the next game in a franchise gets a large overhaul, yet the old loading system (with all of its obsolete optimizations) is reused for the more modern title, which is something I'm certain happens at most AAA studios. They reuse a lot of existing systems and code to reduce the work required to go from concept to release and, hopefully, to shorten the time and effort needed to get to launch. Such systems should be under scrutiny at all times whenever possible, to further streamline the process and optimize it for the majority of players. If that means the outlier customers still playing off a WD Green spinning disk have a worse time because they haven't bought an SSD, while the 90%+ who have at least a SATA SSD benefit from the newer load system, then so be it.

But I'm starting to cross over into my opinions a bit more than I intended, so I'll stop there. I hope that helps make sense of what's happening and why such decisions are made. As always, if anyone reads this and knows more than I do, please speak up and correct me. I'm just some guy on the internet, and I'm not perfect. I don't make games and I'm not a developer; I am a systems administrator, so I see these issues constantly. I know how the subsystems work and I have a deep understanding of the underlying technology, but I haven't done any serious coding work in a long, long time. I may be wrong or inaccurate on a few points, and I welcome any corrections anyone may have.

Have a good day.

[–] [email protected] 0 points 9 months ago

The trick is to download the Fitgirl repack. Cheaper on your wallet and your hard drive.

[–] [email protected] 0 points 9 months ago (1 children)

I've got 2-gig fiber, not 56k dialup. It's Steam's bandwidth now; they paid Valve their 30%. Why bother with insane compression that just makes it feel slow for us?

[–] [email protected] 0 points 9 months ago

That is also a factor I don't understand. Bandwidth costs the storefront money; wouldn't Steam and the others want to decrease that load? And well done, you, with that fiber, you dog! I also have a fiber line, but I see no reason to upgrade from my plan (150 Mbit, I think?) that covers everything, just to shave off that hour of download time a year.

[–] [email protected] 0 points 9 months ago (1 children)

I’ve written software professionally for two decades and I’m still in awe of the people who used to wring every last drop out of 512 KB of memory, a floppy drive and 16 colours on the Amiga 500.

[–] [email protected] 0 points 9 months ago (1 children)

While it's true that that's impressive, games now have to be made to work on variable screen sizes, with different input controllers, key mappings, configurations, and more operating systems, with more features than ever. It's an absolute explosion of complexity.

Even making a 2D game for today's hardware is more difficult than making a 2D game for Gameboy.

[–] [email protected] 0 points 9 months ago

Honest question, is that true? It's my understanding that developing a 2D game today would be a simpler task than for a system from the 90s due to so many improvements in development software.

[–] [email protected] 0 points 9 months ago (1 children)

Play Xonotic. I get 200 fps on Intel HD Graphics 2000.

[–] [email protected] 0 points 9 months ago (1 children)

I absolutely love Xonotic! This is my hype song! https://youtu.be/pLKo5TitJm4?si=wDW8dQCpbaraiNZP

YouTube knows what I like.

[–] [email protected] 0 points 9 months ago (1 children)

What server do you play on and when?

[–] [email protected] 0 points 9 months ago

I don't have a computer ATM. I lost everything over fake criminal charges.

[–] [email protected] 0 points 9 months ago* (last edited 9 months ago)

Very rose-tinted glasses. I remember horrifying cache-corruption bugs that locked you out of certain game areas permanently on that save; random illegal-operation exceptions crashing games (no autosave, btw); the whole system regularly freezing and needing to be completely restarted; and games inexplicably not working at all because of some hardware incompatibility. The internet sucked for finding fixes then, and patches weren't a thing, so you were just screwed.

I would say that games no longer all being written in C and assembly, squeezing out every possible bit of performance with nothing but dev machismo as a safeguard, is in fact a good thing.

[–] [email protected] 0 points 9 months ago (1 children)

Games back then: created by 1 to 4 people with autism because they wanted to have fun on a computer.

Games now: driven by dickheads fresh out of business school at the whims of billionaire conglomerate funds.

[–] [email protected] 0 points 9 months ago (1 children)

I miss when games used to be good. Anyone 'member Vampire Survivors, Lethal Company, Bug Fables? Developers these days just can't compare.

[–] [email protected] 0 points 9 months ago* (last edited 9 months ago) (2 children)

now that's survivorship bias

EDIT: here's the fun thing, Lethal Company would have been a mod back in the day

[–] [email protected] 0 points 9 months ago (1 children)

Tbf, games were easier to create using in-game functions and logic built for another game. Modding a whole rework was easier than making an entire game from scratch. Lethal Company is undeniably similar in look and feel, but it has better gameplay than some mods.

[–] [email protected] 0 points 9 months ago* (last edited 9 months ago)

Exactly. Creating a mod for Half-Life or similar titles was simply the easiest way to get a decent working 3D FPS engine without coding it yourself.

[–] [email protected] 0 points 9 months ago (1 children)

Is your point that developers today aren't as good/benevolent/whatever as devs back in the day? I'm saying (sarcastically, I suppose) that the same type of developers exist today. What does survivorship bias have to do with it? Is my point moot because GMOD exists?

[–] [email protected] 0 points 9 months ago (1 children)

Your point is moot because there's an unending hose of indie games being created, and knowing that two gems exist doesn't mean the rest of the cottage industry measures up to what was achieved earlier; nor does the indie scene have a success rate similar to the old industry's.

[–] [email protected] 0 points 9 months ago

What are you, a shareholder? Why does the 'rate of success' matter? I didn't list three games because there were only two gems.

It's like being at the library and saying "fantasy authors will never compete with what JK Rowling was writing, just look at how many books are here!"
