|
Meyers-Briggs Testicle posted:what about 'the bug is really funny and doesn't detract from the game'? That brings up some really interesting memories.
|
# ? Feb 6, 2018 23:47 |
|
Hyper Crab Tank posted:It would help if you could give an example of the kind of thing you mean, but usually when an issue goes unfixed for a while it's some combination of: 

I left it somewhat vague just to hear different opinions, but I guess I didn't make it clear enough. I wasn't thinking about bugs, but about odd gameplay/balance decisions that may not be apparent to the players. Some things that come to mind:

Hearthstone: They have some weird stances, like refusing to buff cards that never see play, or nerfing cards out of existence rather than making them reasonable. And sometimes they justify those stances with even weirder explanations like "it's not the soul of the card." (TBF they have gotten better about this, but it still happens occasionally.)

Xenoblade Chronicles 2: Having a gacha system in a single-player game with no microtransactions.

Games that sacrifice gameplay for microtransactions. I can't imagine most designers would want this, so I imagine the higher-ups must be pushing those things through.

Games that hide too much content behind really hard difficulties. Presumably the designers would want a good number of people to experience most of the game.
|
# ? Feb 7, 2018 01:21 |
|
Chernabog posted:I left it somewhat vague just to hear different opinions but I guess I didn't make it clear enough. I wasn't thinking about bugs, but about odd gameplay/balance decisions that may not be apparent to the players. 

For games like Hearthstone, or indeed any game that has to cert to either the Apple Store or one of the consoles, it's inordinately expensive to make frequent, small changes. Making the change is one thing, but then you need to QA that change and then cert the build. This is especially true if you are certing to Nintendo, who will come back with problems that you then fix, and then will come back a second time with entirely different, seemingly arbitrary problems, some of which aren't even on their own requirements list... Not that I'm mad. So for any game where you might have wondered, 'why don't they make smaller, incremental updates', this is often the reason why: it's very expensive in raw cash and time.

I think the gacha system was targeting the super hard grinders. It's a very traditional inclusion and I'd liken it to the monster arena in FFX.

Gameplay vs MTXes: I heard a really neat speech by a designer talking about this very thing, who said something to the effect of, 'If you are being forced to put a game together with certain MTX systems and monetization practices, you can still try and succeed in making a fun experience.' I don't think anyone wants to work under aggressive microtransactions, but sometimes them's the breaks, and you can still do your best.

I think games that hide content behind difficulty do that intentionally. Games like Furi, Cuphead, or Myst are trying to deliver a specific kind of experience to a player. That experience can be compromised if you let the difficulty slip too much. 
In the case of a game like DMC, it means you might not be forced to engage with the systems meaningfully, and the game might come off as shallow or other pieces of the design might suffer - it's the difference in experiences between DMC3 and God of War. A game like Myst would suffer a lot more under a simplified model. Myst feels like an ordeal, and finishing it is an awesome feeling. Making Myst easier means that feeling is compromised as well. Games like The Witness and Braid embrace this differently. The Witness is banking on you figuring out its thing to really experience the entire game. Remember when people were talking about finding Stars in Braid, and no one really believed them for literal months? It's so bizarre to think that something like that could happen, but because the hiding spots were so esoteric, it opened the door for such a scenario. Symphony of the Night bet half the game on you piecing together the real ending. Lots of games gate pieces of their content behind difficulty or puzzles, and the experience is stronger when you overcome that. It means that they will inevitably leave some people behind, but I think that's something they decided was worth it.
|
# ? Feb 7, 2018 06:56 |
|
Chernabog posted:I left it somewhat vague just to hear different opinions but I guess I didn't make it clear enough. I wasn't thinking about bugs, but about odd gameplay/balance decisions that may not be apparent to the players. 

The designers balancing a game with lots of discrete abilities, such as a card game, have different priorities than the players. The Magic team often straight-up states "We don't want every card to be equally powerful. We intentionally make some that are weaker than others." Trying to make everything viable often leads to an arms race where by the time you've buffed up everything that was weak, the things that were strong are now average at best.

Similarly, rushing to get fixes into a competitive game where the meta is constantly evolving is often a fool's errand. Maybe your change will expand the options available to players, or maybe it will expose another weakness and collapse the playspace even further. When you're running regular paid-entry tournaments, the known but non-optimal situation is often seen as better than the unknown.

Monetization and fun are often at odds. Designers usually try to balance this as best they can, but ultimately they don't keep working unless the game makes money, so they'll often lean too hard on the money side despite their better judgement. Often the justification is that it's possible to relax monetization mechanics post-launch but nearly impossible to add them in.

There are as many different opinions on difficulty as there are players, and again it's "safer" to shade towards too difficult than too easy. As long as it's not artificially difficult due to obfuscation, controls, or inconsistencies, a hard game presents a challenge to the player, which is generally the point. A game that doesn't require you to explore its mechanics might as well not have any.

Oh, and also: often people just make mistakes, or plain don't have the time or perspective to make better decisions. 
Tricky Ed fucked around with this message at 08:24 on Feb 7, 2018 |
# ? Feb 7, 2018 08:15 |
|
I'll second most of what the above posters said and add:

Hearthstone: What Tricky Ed says is completely true, and thanks to Wizards of the Coast and Mark Rosewater being so open about their process we have a lot of insight into how they think about their game. However, I'm personally of the opinion that Blizzard are nowhere near as good at managing their card game as Wizards is, and Brode is no Rosewater. Frankly I'm as baffled as you are about some of their decisions, and strongly disagree with their priorities. If I were to speculate about what they're thinking, it's some combination of "nerfing popular cards will hurt our bottom line too much" and "we're too busy working on the next set to change old stuff". Plus their patch cycle is horrendously slow due to wanting to have the game on phones, which consistently have awful patch mechanisms.

Xenoblade 2: Sounds like it's just a matter of the designers feeling the feature will be enjoyed by a significant enough portion of players that it's worth doing. Maybe you don't fall into that category, but someone else does.

MTX: I've never met a designer that explicitly planned for microtransactions as a gameplay feature. Usually it's part of the economic pitch. See, when you're making a game, step two or three is convincing someone with money to give you money so that you can give them money back later. And part of that process is convincing them that the game will make enough money. (Investors/producers are notoriously deaf to the idea of just doing a game for fun or art or whatever.) And there is a lot of money in microtransactions if your game is big and popular enough. That's all it is: the people paying for development wanting to maximize their returns. And usually, the decision isn't "game with MTX" or "game without MTX" - it's "game with MTX" or "no game at all".

Difficulty: See the Xenoblade 2 answer above. 

Hyper Crab Tank fucked around with this message at 08:41 on Feb 7, 2018 |
# ? Feb 7, 2018 08:38 |
|
Chernabog posted:Games that hide too much content behind really hard difficulties. Presumably the designers would want a good amount of people to experience most of the game. 

I'm really curious to know what games you're thinking about here.
|
# ? Feb 7, 2018 08:48 |
|
Sindai posted:I'm really curious to know what games you're thinking about here. 

There have been a number of games that gate content off of lower difficulties. I think Cuphead did this?
|
# ? Feb 7, 2018 16:33 |
|
Great answers, those are exactly the sort of thing I was thinking about. Most of my design experience is self-taught and with really small teams, so it is nice to see what the thought process is like in large studios.

As far as games that gate content behind difficulty go, I was thinking mostly about Souls games, which ironically I can give a pass, since I know they are intentionally made tough as nails. But I never understood why they couldn't just add an easier mode with a bit more health, or something else that wouldn't detract from the experience of people who want to play it "as intended". Another example would be the old WoW raids that most players never got to see. I guess they were still figuring it out back then, because they moved away from that type of stuff.
|
# ? Feb 7, 2018 17:00 |
|
Sindai posted:I'm really curious to know what games you're thinking about here. 

monkey island
timesplitters 2
fire emblem (I think one of them did this)
kingdom hearts
superman 64

all of those just straight up end early or are shortened if you pick the lowest difficulty

world of warcraft adds an extra phase / story stuff onto the hardest difficulty fight
|
# ? Feb 7, 2018 17:01 |
|
Chernabog posted:Great answers, those are exactly the sort of thing I was thinking about. Most of my design experience is self-taught and with really small teams so it is nice to see what the thought process is like in large studios. 

I'd argue that the Souls games actually do a pretty great job at this. There's a pretty huge range of difficulties you can take the game at - picking a min-Vitality melee character and always fighting everything solo on one end, and then pumping up your health and calling for help a bunch on the other. Non-boss stuff can seem really hard the first time you try it, but so much of that difficulty comes from surprises/not knowing things' capabilities that it's kind of inevitable that the game gets easier and easier each time you try an area. The stakes for death are also kind of opt-in, difficulty-wise: really tense and hard if you're hoarding souls, but the moment you lose your collection there's basically no downside to death and you get to switch to a much lower-stakes exploration mindset. And there's always grinding, as a last resort.

I think Souls games are genius in how they let their players subtly set their own difficulty levels, while still making the game feel super intimidating and hard no matter how you play.

Counter-point: if your experience was that the game was impenetrably hard and you didn't feel there were ways to mitigate that difficulty, then I'm obviously wrong to some extent. I think it's actually pretty common for games that do this to have a really cool range of difficulties you can play at (Spelunky is the big other example that comes to mind), but for the bottom edge of that range to still be high enough for most people to bounce off of.
|
# ? Feb 7, 2018 18:51 |
|
Canine Blues Arooo posted:For games like Hearthstone, or indeed any game that has to cert to either the Apple Store or one of the console's, it's inordinately expensive to make frequent, small changes. Making the change is one thing, but then you need to QA that change and then cert the build. This is especially true if you are certing to Nintendo, who will come back with problems that you then fix, and then will come back the second time with entirely different, and seemingly arbitrary problems, some which aren't even on their own requirements list... Not that I'm mad. So for any game where you might have wondered, 'why don't they make smaller, incremental updates', this is often the reason why - It's very expensive in raw cash and time. Especially for online-only games, don't developers frequently download a configuration for balance at launch or something? i.e. a file of parameters that says "ok the pistol cooldown is 420ms and the uzi does 69 damage today"
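Something like this is what I'm imagining, as a made-up sketch (the weapon names, fields, and numbers are all invented; presumably the real thing gets fetched over HTTPS at boot):

```python
import json

# Made-up server response; a real game would download this at startup
raw = '{"pistol": {"cooldown_ms": 420}, "uzi": {"damage": 69}}'

def apply_balance(raw_json, weapons):
    """Overlay server-tuned values onto the defaults shipped in the build."""
    for name, fields in json.loads(raw_json).items():
        weapons.setdefault(name, {}).update(fields)
    return weapons

shipped = {"pistol": {"cooldown_ms": 500, "damage": 10}}
tuned = apply_balance(raw, shipped)
# pistol cooldown is now 420ms without shipping a new build
```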
|
# ? Feb 7, 2018 22:11 |
|
OtspIII posted:I'd argue that the Souls games actually do a pretty great job at this. There's a pretty huge range of difficulties you can take the game at--picking a min-Vitality melee character and always fighting everything solo on one end, and then pumping up your health and calling for help a bunch on the other. 

Yeah, that's why I can give them a pass. I really liked what I was able to play in Bloodborne, but I gave up eventually, after countless deaths and frustration. I even tried to grind experience and call for help but still managed to fail somehow. And then I felt like I missed a big chunk of the game just for arbitrary reasons.

Someone spoke about Cuphead earlier. I don't know how far you can get with the easy mode, but I could totally see someone getting it because they love the art style and then hitting a stone wall pretty fast. That said, one of the things the game has going for it is that whenever you get hit you almost always feel like it was your own fault and not some RNG bullshit. I also like that as long as you pass a level the stars you get don't seem to matter, so there's something for those who want the challenge of acing everything.
|
# ? Feb 8, 2018 18:17 |
|
Canine Blues Arooo posted:For games like Hearthstone, or indeed any game that has to cert to either the Apple Store or one of the consoles, it's inordinately expensive to make frequent, small changes. Making the change is one thing, but then you need to QA that change and then cert the build. 

Hyper Crab Tank posted:Plus their patch cycle is horrendously slow due to wanting to have the game on phones, which consistently have awful patch mechanisms. 

I've heard horror stories about the console certification process, but I was under the impression Apple/Google were a comparative walk in the park. What's so rough about those stores?

edit: I suppose I heard those horror stories back in the ps3/360 days, so things might have changed since then? 

hey girl you up fucked around with this message at 05:25 on Feb 9, 2018 |
# ? Feb 9, 2018 04:53 |
|
hey girl you up posted:I've heard horror stories about the console certification process, but I was under the impression Apple/Google were a comparative walk in the park. What's so rough about those stores? 

Google has no certification process at all, for good and bad. Apple's certification process takes maybe a week? It varies a lot. It's not hard to get through. No, the problem is Unity games don't support incremental patching on phones, which means every Hearthstone patch is a giant, multi-gigabyte download. This deters Blizzard specifically from patching the game more frequently. They have occasionally mentioned working on a way to patch card data without needing to go through the app store, but so far that hasn't materialized to my knowledge.

This is compounded by the fact that a lot of people don't have unlimited data plans, and downloading large updates over 4G is typically disallowed by the app stores. Necessitating wifi to patch is another hurdle that Blizzard don't want to inflict on users too often. 

Hyper Crab Tank fucked around with this message at 09:12 on Feb 9, 2018 |
# ? Feb 9, 2018 09:09 |
|
Canine Blues Arooo posted:For games like Hearthstone, or indeed any game that has to cert to either the Apple Store or one of the console's, it's inordinately expensive to make frequent, small changes. Making the change is one thing, but then you need to QA that change and then cert the build. This is especially true if you are certing to Nintendo, who will come back with problems that you then fix, and then will come back the second time with entirely different, and seemingly arbitrary problems, some which aren't even on their own requirements list... Not that I'm mad. So for any game where you might have wondered, 'why don't they make smaller, incremental updates', this is often the reason why - It's very expensive in raw cash and time. This can be true, but there is another reason. When you do game balancing, it takes time for the "Meta To Settle". For example, if you increase the DPS of a popular unit by 20%, then a lot of people will immediately say it's OP and ask for it to be changed, but there are probably a bunch of other changes in the patch which compensate for the DPS rise, which people at first glance aren't really taking into account. You won't actually know that the unit is truly OP for a few weeks, when people have had time to really play with it and to search for units and tactics which can counter it properly. If you release balance patches too frequently, then you'll be basing your balance decisions on incomplete data, and you'll probably end up wasting a bunch of time and doing incorrect things.
|
# ? Feb 9, 2018 10:45 |
|
Gerblyn posted:This can be true, but there is another reason. When you do game balancing, it takes time for the "Meta To Settle". For example, if you increase the DPS of a popular unit by 20%, then a lot of people will immediately say it's OP and ask for it to be changed, but there are probably a bunch of other changes in the patch which compensate for the DPS rise, which people at first glance aren't really taking into account. You won't actually know that the unit is truly OP for a few weeks, when people have had time to really play with it and to search for units and tactics which can counter it properly. 

Apple forbids code patches that sidestep their cert process. There is a built-in system in Unity for doing this (asset bundles), but everyone (well, most - I've seen some cavalier behaviour from Korean and Japanese studios) limits their use to data (it keeps the initial install size down) to avoid the ghost of stebe's wrath.

Separately, balance data, store offers, and the like are usually served over HTTP. Basically, you skirt around the edges as much as you can while avoiding things that you know will get your game booted or will make Apple mad at you to the point of not featuring your game again. Apple's requirements are also vague, and whether you will pass is subject to the phase of the moon and the time of day.
|
# ? Feb 9, 2018 13:17 |
|
Hyper Crab Tank posted:It would help if you could give an example of the kind of thing you mean 

One recent example I came across would be how the AI in Civ 6 apparently won't ever take a city, because it prioritizes not taking heavy losses too much. Can that be put down to incompetence, or can the devs here think of some legit reasons I'm not considering for why that's not being fixed as a priority?
|
# ? Feb 9, 2018 14:36 |
|
leper khan posted:Apple forbids code patches to sidestep their cert process. There is a built in system in unity for doing this (asset bundles), but everyone (most.. I’ve seen some cavalier behaviour from Korean and Japanese studios) limits their use of them to data (keeps initial install size down) to avoid the ghost of stebe’s wrath. Sure, I wasn't disagreeing at all, I was just saying there are other reasons to avoid frequent balance patching.
|
# ? Feb 9, 2018 15:20 |
|
Chernabog posted:Hearthstone: They have some weird stances like refusing to buff cards that never see play or nerfing cards out of existence rather than making them reasonable. And sometimes they justify those stances with even weirder explanations like " it's not the soul of the card." (TBF they have gotten better about this but it still happens occasionally.) A lot of things are just not good designs and a fun place for them can't really be found. If something is amazing but only works in specific situations, then more has to be dedicated to producing those situations and/or trying to prevent disadvantageous encounters, so that creates incentives for extremely defensive compositions. Conversely, some things become more effective if you run more of it (usually because there are severe drawbacks to running enough counters to deal with it), so then it's either something that dominates your entire composition, or isn't worth it. Some things are more effective at different skill levels, and a lot of players just don't really care if something is ineffective and just want to play the fantasy (*cough* sniper rifles).
|
# ? Feb 9, 2018 15:21 |
|
With Hearthstone as well, Blizzard has access to huge amounts more information than the players. Just because a player thinks a card is never used or is worthless, doesn't necessarily mean that's the case. It wouldn't surprise me if a Blizzard analyst could say "Actually this card is featured in 14.6% of player sorcerer decks" or "in 57.3% of cases, when this card was played in a match, the player won". I'm not saying that players are always wrong and Blizzard are always right, just that Blizzard know a lot more about the actual statistics of the game than players ever will.
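As a toy version of the kind of query an analyst might run (the cards and match logs here are completely invented; the real thing would be telemetry at a vastly larger scale):

```python
from collections import Counter

# Invented match logs: (cards in a player's deck, did that player win?)
matches = [
    ({"Fireball", "Polymorph"}, True),
    ({"Fireball", "Arcane Intellect"}, False),
    ({"Polymorph"}, True),
    ({"Arcane Intellect"}, False),
]

def card_stats(matches):
    """Per card: (number of decks it appeared in, win rate of those decks)."""
    played, won = Counter(), Counter()
    for deck, win in matches:
        for card in deck:
            played[card] += 1
            won[card] += win  # True counts as 1
    return {card: (played[card], won[card] / played[card]) for card in played}

stats = card_stats(matches)
# e.g. stats["Fireball"] == (2, 0.5): featured in 2 decks, won half of those games
```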
|
# ? Feb 9, 2018 15:31 |
|
Chernabog posted:Games that hide too much content behind really hard difficulties. Presumably the designers would want a good amount of people to experience most of the game. This can be a way of hiding that there's not actually all that much game to experience, as in Superman 64.
|
# ? Feb 9, 2018 16:57 |
|
E.T. hid landfills worth of content.
|
# ? Feb 9, 2018 17:10 |
|
Avalerion posted:One recent example I came across would be how the AI in Civ 6 apparently won't ever take a city, because it priories not taking heavy looses too much. Can that be put down to incompetence, or can the devs here think of some legit reasons why that's not being fixed as a priority that I'm not considering? The AI definitely conquers cities in Civilization VI. The "problem" is that the AI in that game has different priorities depending on nation, and some nations are really risk averse. And even at the best of times, the Civ AI is really bad at using military units effectively. But the Civilization series has always had bad AI, to the point where it's almost a trope that Firaxis can only be consistently counted on to put out games that 1) are extremely buggy immediately after release, and 2) have lovely AI. As always, the likely answer is that they consider it not bad enough of a problem that it takes priority over working on things that are actually going to earn their next paycheck.
|
# ? Feb 9, 2018 17:12 |
|
"Bad frame pacing" has become a bit of a buzzword lately, popularized by entities like Digital Foundry to describe a phenomenon in which despite running at a high framerate, a game delivers frames at inconsistent intervals, leading to the perceived effect that the game is stuttering or running at a lower framerate than it actually is. Are there any programmer types in this thread who can speak to why a particular game would have this issue, architecturally speaking? Is it a sign of a game that has decoupled its physics timestep from rendering, perhaps?
Lork fucked around with this message at 02:41 on Feb 12, 2018 |
# ? Feb 12, 2018 02:38 |
|
Lork posted:"Bad frame pacing" has become a bit of a buzzword lately, popularized by entities like Digital Foundry to describe a phenomenon in which despite running at a high framerate, a game delivers frames at inconsistent intervals, leading to the perceived effect that the game is stuttering or running at a lower framerate than it actually is. Are there any programmer types in this thread who can speak to why a particular game would have this issue, architecturally speaking? Is it a sign of a game that has decoupled its physics timestep from rendering, perhaps? Inconsistent frame rate is usually caused by garbage collection or varying graphics loads (lots of FX and the like). Decoupling physics timestep from the render timestep has been SOP for a long time, and ensures that any frame rate issues don’t result in altered physics. If they weren’t decoupled, you could spawn explosions or lots of mobs, jump, then vary your jump height by rotating things in/out of view (assuming frustum culling).
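For anyone curious what that decoupling looks like, here's a minimal sketch of the standard accumulator pattern (generic pseudostructure, not any particular engine's code):

```python
FIXED_DT = 1.0 / 60.0  # physics always advances in fixed 60Hz steps

def simulate(frame_times):
    """Step physics at a fixed rate no matter how long each render frame took."""
    accumulator, steps = 0.0, 0
    for dt in frame_times:              # dt = wall-clock duration of the frame
        accumulator += min(dt, 0.25)    # clamp huge hitches ("spiral of death")
        while accumulator >= FIXED_DT:
            steps += 1                  # stand-in for physics_step(FIXED_DT)
            accumulator -= FIXED_DT
    return steps

# ten frames at 30fps and twenty frames at 60fps run the same amount of physics,
# so frame rate hiccups never change jump heights or explosion results
slow = simulate([1 / 30] * 10)
fast = simulate([1 / 60] * 20)
```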
|
# ? Feb 12, 2018 03:06 |
|
leper khan posted:Inconsistent frame rate is usually caused by garbage collection or varying graphics loads (lots of FX and the like). 

I'm not talking about inconsistent framerates. "Frame pacing issues" as described by Eurogamer refers to a phenomenon where a game appears to stutter despite running at a constant framerate.

Eurogamer on Bloodborne posted:And yet something is awry when playing Bloodborne. Its sub-30fps frame-rate drops may be infrequent, but on close analysis the bigger issue here is in its frame-pacing. As it turns out, From Software's implementation of a 30fps cap means that, as promised, we do get an average refresh at that number near-constantly throughout Yharnam city. The problem? As we've seen with the launch builds of Need for Speed: Rivals and Destiny, an incorrect ordering of frames can cause a nasty stuttering to motion. 

Eurogamer on Need for Speed Rivals posted:As a result, the tweak minimises a problem on PC that's common to console platforms: the frame-pacing issue. Even on the latest patch version 1.01, this persists throughout the 30fps play on PS4 and Xbox One, just as it did in our preview analysis of the game. In a nutshell, the uneven spread of unique frames creates the sensation of screen judder, making panning camera motions appear to stutter. This is despite the overall frame-rate average sustaining the target 30fps mark. It's a curious quirk of the engine that we see in Frostbite 3 driven Battlefield games on PC too, where the engine occasionally renders two duplicate frames followed by two unique frames to cause a perception of performance being lower than it actually is. 

quote:Decoupling physics timestep from the render timestep has been SOP for a long time, and ensures that any frame rate issues don't result in altered physics. If they weren't decoupled, you could spawn explosions or lots of mobs, jump, then vary your jump height by rotating things in/out of view (assuming frustum culling).
|
# ? Feb 12, 2018 03:36 |
|
Lork posted:"Bad frame pacing" 

Games that are very animation-driven can also cause this phenomenon. The frame rate might be a perfect 60fps, but if the animations contain any kind of pause, delay, or sleep timer, it can make the game look out of sync. The same is true for games that have hit-delays/pauses on collision events, such as fighting games.

Also, poor animations that don't have proper frame counts or animation loops that support your insane frame rate can often skip frames (or tween poorly to fill frames) and make things look strange and non-fluid, which is always mistaken for a performance hit or "lag spike". One project I was hired for had used default blend/tween values for every animation, resulting in the characters looking like they were walking in tar/molasses, with all the animation timings way off. They thought it was bad frame rate, and once I pointed out the issue and the blend/tweens were properly adjusted, everything was buttery smooth. 

Slayerjerman fucked around with this message at 06:08 on Feb 12, 2018 |
# ? Feb 12, 2018 06:01 |
|
Lork posted:I'm not talking about inconsistent framerates. "Frame pacing issues" as described by Eurogamer refers to a phenomenon where a game appears to stutter despite running at a constant framerate. I'd like to understand this too, because I've been making and playing games for a long time and the phrases "an incorrect ordering of frames" and "produces two unique frames followed by two duplicates - rather than one after another" make no sense to me.
|
# ? Feb 12, 2018 06:35 |
|
Triarii posted:"produces two unique frames followed by two duplicates - rather than one after another" let's say your monitor or TV has a 60hz refresh rate (or, at least, that's what's being targeted). if a game is running at 30fps, the system has to duplicate all of those frames once (to make it 60, the same as the refresh rate) for you to not get flickering. ideally, it's supposed to go: unique frame, duplicate frame, unique frame, duplicate frame. in the case being complained about, it instead goes: unique frame, unique frame, duplicate frame, duplicate frame.
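to put numbers on it: both cadences below show 4 unique frames over 8 refreshes ("30fps" either way), but only one of them is evenly spaced (just a toy illustration, not how any compositor is actually implemented):

```python
def present(frames, pattern):
    # pattern[i] = how many 60hz refreshes frame i stays on screen
    shown = []
    for frame, repeats in zip(frames, pattern):
        shown.extend([frame] * repeats)
    return shown

good = present("ABCD", [2, 2, 2, 2])  # A A B B C C D D - even 30fps
bad  = present("ABCD", [1, 3, 1, 3])  # A B B B C D D D - "30fps" that judders
```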
|
# ? Feb 12, 2018 06:41 |
|
Lork posted:I'm not talking about inconsistent framerates. "Frame pacing issues" as described by Eurogamer refers to a phenomenon where a game appears to stutter despite running at a constant framerate. That problem sounds like they messed up a triple buffering implementation. I’m unfamiliar with that engine’s internals though. UE4 has physics substepping. Unity has separate update functions for fixed time. A lot of proprietary engines also implement some sort of decoupling physics updates from frame update.
|
# ? Feb 12, 2018 06:44 |
|
leper khan posted:That problem sounds like they messed up a triple buffering implementation. I’m unfamiliar with that engine’s internals though. I suppose that's not impossible, but this can also be caused simply by the first two frames lasting under 16.67ms each (for 60fps--the sync rate of the TV in this case), plus one frame that didn't come in under budget, which in regular double-buffered vsync will make you spend 33.33ms on the next frame (which, when the tv is syncing at 60hz, will appear as 2 identical frames). There are so many possible causes of this that it's difficult to speculate, but as for why it's more of a problem now than say, 10-15 years ago is just that there's more poo poo happening at any given time in any given game, and it's harder to test all the possible combinations of systems. It's not uncommon at all these days, if you've got a system that you know is too slow to update every frame, to put it on a timer to run every N frames or M seconds. If you plan well, or have a system that can distribute these kind of workloads well behind the scenes, it works great. But all it takes is for one of these systems to blow its budget, or two jobs to get synced up on the same frame, and you'll get a 1-frame spike. If you're just watching the FPS counter, you may not notice it, but it'll be really annoying for people playing the game for a long time.
|
# ? Feb 12, 2018 09:54 |
|
What they're saying is that the games are delivering the expected average 30 frames per second, but those frames are not evenly distributed across that second. Let's try an example. I'll use 50 fps rather than 30 because it makes it easier to understand, but the principle is the same. 50 fps means an average frame time of 20 milliseconds. What you want is for a frame to be presented to the user at t=20ms, 40ms, 60ms, 80ms, 100ms...960ms, 980ms, 1000ms. But you screwed something up, so each individual frame isn't arriving at the right time. The first few frames show up at 20ms, 40ms, 60ms, but the next one comes in at 90ms. Then the next one at 100ms. Then one at 105ms, then the next takes until 130ms to arrive, then the next at 140ms, 160ms, 180ms... What happened there was a kind of micro-judder. You delivered all the frames you were supposed to in the given second; your framerate is a solid 50. But the actual frames didn't arrive evenly spaced, so the end result looks like a jittering mess.
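You can see it in the numbers: take arrival times like the ones in that example and compare them to an evenly paced stream. The frame count is identical; only the gaps differ (this little check is purely illustrative):

```python
even   = [20, 40, 60, 80, 100, 120, 140, 160, 180, 200]   # one frame every 20ms
uneven = [20, 40, 60, 90, 100, 105, 130, 140, 160, 180]   # same count, bad pacing

def gaps(timestamps_ms):
    """Interval between each consecutive pair of frame arrival times."""
    ts = [0] + timestamps_ms
    return [b - a for a, b in zip(ts, ts[1:])]

# both streams delivered 10 frames, but the uneven one has gaps from 5ms to 30ms,
# which the eye reads as micro-judder even though the "fps" counter looks fine
```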
|
# ? Feb 12, 2018 19:26 |
|
From the outside looking in, it seems like any "frames per second" measurement other than 1/(largest frame draw time in a second) (i.e., minimum instantaneous framerate) is not a measurement useful to a viewer. If someone was driving on the highway, sat in stopped traffic for 15 minutes, and drove the remaining 60 miles in 45 minutes, no rational person would say, "You drove a solid 60 miles per hour, my dude, don't see what the issue is."
|
# ? Feb 12, 2018 20:03 |
|
leper khan posted:UE4 has physics substepping. Unity has separate update functions for fixed time. A lot of proprietary engines also implement some sort of decoupling physics updates from frame update. UE4 can run the physics engine in substeps to avoid problems with large delta times causing unexpected behaviour in PhysX. This is not the same thing as separating the update rate from the render rate. At an architectural level, UE4 has almost nothing that separates the game-logic update rate from the render update rate. Objects move, particle systems update, and material parameters are changed all on the game thread, which sends instantaneous updates to the render thread. And as far as I am aware, the render thread has no ability to interpolate between past states. As people have said, the problem of 'frame pacing' is that the average framerate from second to second remains fairly solid while individual frames take longer or shorter times to produce. In my experience the simple reason this happens is that game performance is hard to budget for, and some frames just take longer than others. A frame where you spawn a new object or particle system, or have to rebalance some data structure, or where many entities panic and run an instant pathfind or behaviour tree search, takes longer and causes a microstutter.
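For contrast, the decoupled approach described as missing here (fixed-timestep simulation plus render-side interpolation between past states) can be sketched in a few lines. This is the generic pattern, not UE4 or Unity code, and `simulate` is a placeholder for whatever advances the game state:

```python
FIXED_DT = 1.0 / 60.0  # simulation step, independent of the render rate

def run_frame(state, prev_state, accumulator, frame_dt, simulate):
    """Advance the simulation in fixed steps; return an interpolation
    alpha so the renderer can blend prev_state..state for smooth motion."""
    accumulator += frame_dt
    while accumulator >= FIXED_DT:
        prev_state, state = state, simulate(state, FIXED_DT)
        accumulator -= FIXED_DT
    alpha = accumulator / FIXED_DT  # 0..1: how far between the two states we are
    return state, prev_state, accumulator, alpha

# a 30 Hz render frame advances a toy integrator by exactly two fixed steps
state, prev, acc, alpha = run_frame(0.0, 0.0, 0.0, 1.0 / 30.0,
                                    lambda s, dt: s + dt)
```

Because the renderer blends between the last two simulated states, a slow or fast render frame changes only the alpha, not the simulation, which is what keeps motion smooth when the frame rate wobbles.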
|
# ? Feb 12, 2018 20:12 |
|
hey girl you up posted:From the outside looking in, it seems like any "frames per second" measurement other than 1/(largest frame draw time in a second) (i.e., minimum instantaneous framerate) is not a measurement useful to a viewer. Before I built tools, I was a hardware analyst, and benchmarking games was a really big part of what I did. Measuring FPS is kind of a mess for the reasons above and more. Generally, a report isn't expressed as a single number, but as a block of numbers. The reports I generally sent out were generated from a sample of anywhere from 5 - 30 minutes of play, preferably of the 'repeatable' type (replays, etc.). From there, depending on the product, you might get a data point every second, or every frame, or over some other period. It didn't really matter, other than that if the period is too long, the 'spikes' in the data get leveled out at the data-gathering layer, and that needs to be considered by the stakeholders. With the data in hand, one could generate a report. The report generally would include actual frame rate over time as a line, several percentile points (5%, 10%, 25%, 90% was pretty normal), mean FPS, standard deviation, and then something to normalize the variance exhibited by the standard deviation when frame rates were higher or lower. A simple version would be 'SampleVariance = Mean / StdDv'. With all that in hand, the stakeholders have to interpret it all and decide what's important and what the definitions for 'acceptable' and 'problematic' are. Studios sometimes have a 'target' framerate for a product given some PC spec and settings. They'll then say that X% of samples need to be at or above that target (usually 90% - 95%). Others just want a general idea and will address issues when they see them.
Consoles are different since a 30 or 60 FPS lock is often considered paramount, so in the event of a dip, you might try to trace back exactly where the dip was witnessed and attempt to fix that particular piece of the game to maintain the lock. Measuring FPS is an interesting challenge, though, since it's very difficult to put hard and fast rules on any part of it, and a lot of it ends up being subject to interpretation of the data.
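The shape of that kind of report is easy to mock up. This is an illustrative sketch only: the field names and the simple percentile method are my own, not the analyst's actual tooling, though it does include the 'Mean / StdDv' normalization described in the post.

```python
import statistics

def fps_report(fps_samples):
    """Summarize per-interval FPS samples as a block of numbers:
    percentiles, mean, standard deviation, normalized variance."""
    s = sorted(fps_samples)

    def pct(p):
        # nearest-rank percentile; crude but fine for an illustration
        return s[min(len(s) - 1, int(p / 100.0 * len(s)))]

    mean = statistics.mean(s)
    stdev = statistics.pstdev(s)
    return {
        "mean": mean,
        "stdev": stdev,
        "p5": pct(5), "p10": pct(10), "p25": pct(25), "p90": pct(90),
        # higher means steadier relative to the average frame rate
        "sample_variance": mean / stdev if stdev else float("inf"),
    }

# e.g. a run that holds 60 fps but dips to 30 for a tenth of the samples
report = fps_report([60.0] * 90 + [30.0] * 10)
```

A 90%-of-samples-at-target rule like the one mentioned above would then just be a check against `report["p10"]`.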
|
# ? Feb 12, 2018 21:10 |
|
The 2b2t minecraft server does a fantastic service to the public by illustrating what a completely laissez-faire multiplayer approach looks like
|
# ? Feb 12, 2018 21:32 |
|
Interesting. It does make a certain amount of sense to me - modern load balancing systems prevent FPS drops in situations where they would've occurred in older games, but even they can be overloaded, which influences FPS in a more subtle way. Thanks for the insight.
|
# ? Feb 12, 2018 23:21 |
|
I guess my confusion was because I think of framerate in terms of individual frames - if your target is 60 fps and a frame takes 14ms then you're under budget, if it takes 20ms then you're over - and these reviewers are talking about average framerate over the course of a second. That doesn't seem like a very useful measure to me because you could have the game freeze for 0.75 seconds and then render 60 frames over the remaining 0.25 and that would read as "60 fps" when that's obviously an unplayable mess. I would still just describe this problem as "inconsistent framerate" because all this language about "incorrect ordering of frames," "appears to stutter despite running at a constant framerate," and "producing two unique frames followed by two duplicates" seems to confuse the issue. In the latter case, if you see the same image on screen for two cycles, it's not because the game rendered two duplicate frames - it just didn't render anything for the second frame, so the same image from before is still sitting there.
|
# ? Feb 13, 2018 03:57 |
|
Triarii posted:if you see the same image on screen for two cycles, it's not because the game rendered two duplicate frames - it just didn't render anything for the second frame, so the same image from before is still sitting there.
|
# ? Feb 13, 2018 12:31 |
|
In the case where the game is supposed to send "unique frame, duplicate frame, unique frame, duplicate frame" but instead sends "unique frame, unique frame, duplicate frame, duplicate frame," is it sending "unique frame 1, unique frame 2, duplicate of frame 1, duplicate of frame 2" or "unique frame 1, unique frame 2, duplicate of frame 2, duplicate of frame 2"
|
# ? Feb 13, 2018 18:28 |