Nam Taf
Jun 25, 2005

I am Fat Man, hear me roar!

Malloc Voidstar posted:

You can't play Fallout 4 at 144FPS anyway, physics and FPS are linked.

:psyduck: How does that decision get made and be considered a remotely acceptable solution?


Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?
As long as the minimum FPS is >=60, it shouldn't be a huge problem for playability. It might upset the benchmarking crowd.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Nam Taf posted:

:psyduck: How does that decision get made and be considered a remotely acceptable solution?

Console port/considerations, usually. I forget what the lead platform was for Fallout, but Sega's port of Valkyria Chronicles has the same problem.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

Nam Taf posted:

:psyduck: How does that decision get made and be considered a remotely acceptable solution?
It's been in their engine since at least Skyrim and probably Oblivion.
It doesn't affect consoles, most people on PC are at 60FPS, and it's not a directly linear increase. So 30->60 is almost unchanged physics speed, majority of players have no issue.
It probably requires a major engine rewrite to fix.
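A toy sketch (absolutely not Bethesda's actual code) of why stepping physics once per rendered frame with a fixed dt ties simulation speed to framerate:

```python
# Hypothetical illustration: the engine advances physics by a fixed dt
# once per rendered frame, so rendering more frames per wall-clock second
# also runs *simulated* time faster.

PHYSICS_DT = 1.0 / 60.0  # engine tuned around 60 FPS

def simulate(fps, wall_seconds=10.0):
    """Drop an object for `wall_seconds` of real time at a given framerate."""
    velocity = position = 0.0
    for _ in range(int(fps * wall_seconds)):  # one physics step per frame
        velocity += 9.81 * PHYSICS_DT
        position += velocity * PHYSICS_DT
    return position

# At 60 FPS the object falls ~490 m in 10 s, as expected for g = 9.81;
# at 144 FPS the same 10 s of wall time runs 2.4x as many physics steps,
# so the object falls several times further -- the world runs fast.
print(simulate(60), simulate(144))
```

The standard fix is a fixed-timestep loop decoupled from rendering (accumulate wall time, step physics in fixed increments, render as often as you like), which is presumably why untangling it would mean a major engine rework.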

lewtt
Apr 2, 2011

The Great Twist

Mr Chips posted:

4 x Haswell cores @ 4.5GHz seems an unlikely bottleneck, unless every recent game has suddenly switched to needing more than 4 threads. Have you ruled out thermal throttling, background tasks slowing things down, etc etc?

edit: so I did some random googling for benchmarks, it seems that there's no CPU + single GTX980 combo that will give you a minimum of 120fps @ 1920x1080 with everything turned on in Fallout 4.

Well, thanks guys. I didn't actually think 120+ fps in the games I play was unrealistic up until now, so I'm glad I posted in here before an impulse buy. I've spent at least a week of manhours over the past year just googling possible issues with stuff like bios, timings, and throttling. I've probably spent even longer tweaking settings in every single game to try to maintain that fps.

I was able to keep 120fps in fallout 4, but only by using a mod to automatically reduce shadow draw distance to maintain it. For the most part, it worked, though gamebryo engine is still gamebryo. the physics engine linked to framerate was also a major problem for me until I discovered the SetPlayerAIDriven 0 command, which unfucked getting stuck in terminals and having to alt-f4 every time. Still didn't stop a quarter of the mobs from falling through the world or flying into the sky on explosive kill though

I run an aftermarket cpu fan whose name evades me right now, but I rarely go above 60c on my 6970k and 80c on my 980ti, so I don't think heat is the issue. I'm particular about what runs in the background, so besides required windows services, I typically only have steam, msi afterburner, rivatuner, voip, and my antivirus running. I've spent way too long troubleshooting problems unrelated to the actual power of my cpu/gpu/ram/ssd/blahblah, to the point I really can't think of anything that could be causing a steady issue. The only thing I haven't replaced is the motherboard: https://www.asus.com/us/Motherboards/MAXIMUS_VI_HERO/

As for knowing it's the CPU causing the bottleneck, I usually run HWmonitor and MSI afterburner, watching the cpu, gpu, and memory usage. Most games hit 100% cpu utilization, but the poorly optimized ones top out at 50-60%. I was usually able to tell, after neutering all GPU-related settings and pumping up draw distance/count, what the game's cpu utilization would top out at.

kujeger
Feb 19, 2004

OH YES HA HA
re: fallout 4, i found that having a fps limiter set at 100 removed most of the bugs while still allowing a pretty nice framerate.

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!
Is it even possible to see a difference between say 60 Hz and 144 Hz?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

blowfish posted:

Is it even possible to see a difference between say 60 Hz and 144 Hz?

I would imagine so, certainly. Staring at a CRT that's actually refreshing at 60Hz was always rather painful to me, visibly flickering.

Whether it's worth going crazy and spending money on the best hardware available to consistently exceed 60 FPS in every game, that's a different question entirely.

Valve [for the HTC Vive] seemed to home in on 90 Hz as a reasonable target for VR, which clearly resonated with Oculus, because the Rift is also running a 90 Hz panel.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

blowfish posted:

Is it even possible to see a difference between say 60 Hz and 144 Hz?

The human eye is capable of seeing and noticing absurd amounts of detail, and it's not a computer system where this sort of answer is a firm yes or no in the way you may be thinking. Yes, it is possible to see the difference. Yes, it is more subtle than the 30-60 leap. But it's noticeable, especially if you're looking for it.

Lovable Luciferian
Jul 10, 2007

Flashing my onyx masonic ring at 5 cent wing n trivia night at Dinglers Sports Bar - Ozma
In some games going above 60 FPS makes the game feel more responsive.

WhyteRyce
Dec 30, 2001

Some professional gamer back in the day described it to me in degrees per frame, if you imagine making lots of rotations and spins in a twitch-based shooter

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

blowfish posted:

Is it even possible to see a difference between say 60 Hz and 144 Hz?

Back when we all had CRTs, being able to run your CRT at 85hz refresh (and your game at 85 fps to match) was very noticeable - and the same for when you had a really high end monitor that did 120 hz on a CRT with a beefy computer to keep up. 144 hz you might not see all the difference, but you'll definitely see at least as much of a difference as 85 or 120 vs 60.

The idea that 60 is as far as most people go in being able to really see a difference is a misinterpretation of the fact that 60 hz/fps is a pretty good place to target for your money. Going above it often costs a lot more in processing power and hardware, so it's a pragmatic target to pick.

Same reason that movies have been at 24 FPS for so long - anything much higher meant needing a whole lot more physical film to shoot, edit, and project, and 24 FPS can be ok enough for use. And now we've got an industry where everyone is trained around the limits of 24 FPS and the precise way to light scenes that work well with 24 FPS, so there's gonna be a lot of retraining before they can handle higher framerates properly.

fishmech fucked around with this message at 16:36 on Feb 1, 2016

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

blowfish posted:

Is it even possible to see a difference between say 60 Hz and 144 Hz?

60 to 100hz is very dramatic by itself

HMS Boromir
Jul 16, 2011

by Lowtax

fishmech posted:

Same reason that movies have been at 24 FPS for so long - anything much higher meant needing a whole lot more physical film to shoot, edit, and project, and 24 FPS can be ok enough for use. And now we've got an industry where everyone is trained around the limits of 24 FPS and the precise way to light scenes that work well with 24 FPS, so there's gonna be a lot of retraining before they can handle higher framerates properly.

We've also got a lot of eyeballs trained to watch 24 FPS movies and holy hell does anything higher look wrong. It wouldn't be a big loss for me since I barely watch movies but if anyone manages to hook a new generation onto >24 FPS and it becomes the new standard I might have to stop entirely.

I've never seen refresh rates above 60 but considering how huge the jump from 30 to 60 feels there's gotta be more you can squeeze out of human perception. Contrary to movies, I hope higher refresh rates take off for games so I can save money on parts by sticking to 60 the way I've been doing by staying behind the resolution curve.

HMS Boromir fucked around with this message at 17:23 on Feb 1, 2016

EdEddnEddy
Apr 5, 2012



Once I got a 120Hz 3D monitor, I finally got what the >60Hz crowd was all on about. While holding a solid 60 in 90% of games is great, being able to go up to 120 and have a silky smooth frame-displaying experience is quite nice. Everything just feels even smoother.

30FPS is downright terrible to sit through unless it is some really cinematic experience game or something.

Now the whole 75/90Hz/FPS thing coming with VR is going to be a little more of a challenge. Being able to play some games down to the 50FPS level on a screen was no big deal, but in VR, the FPS has to remain at the VR specified speeds, and right now, in high detail games (like Elite Dangerous), 75 is a bit tough in VR, and 90 is going to be downright hard unless I get a new Pascal-powered Nvidia card (or 2) when those beasts launch.

It is interesting how, when the system is running normally, I can run 120+ FPS without VR, but the minute you are shoved into VR, the performance is almost cut in half (which is to be expected), and as long as there aren't any other bottlenecks (like SteamVR seems to have currently when using the Oculus through it) things work. But when they don't for whatever reason (not currently hardware based), man, it destroys the experience, and I can imagine it completely turning off some people from VR, at least for a generation or two, if Steam/Valve/Oculus don't get things right and let the hardware really off the leash.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

We've also got a lot of eyeballs trained to watch 24 FPS movies and holy hell does anything higher look wrong. It wouldn't be a big loss for me since I barely watch movies but if anyone manages to hook a new generation onto >24 FPS and it becomes the new standard I might have to stop entirely.

Honestly the most likely end result is movies moving to 30 fps and 60 fps, just like HDTV. And there's no reason they can't do it right at high framerate.

I mean, 90% of the reason the 48 FPS The Hobbit was so bad was that the movie was poorly shot in general, ya know?

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



The human eye doesn't observe in frames per second, so the whole idea is ridiculous anyway. Military testing on human response time has shown people reacting to images shown at several hundred frames per second, but there's a big difference between what you can notice/react to and the time it takes the brain to actually process the information being observed, and it's definitely not a matter of "the brain can process all the information it receives as fast as it receives it".

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me
How fast the eye can see things is a fairly complex thing. 24fps is basically what it took to make films stop flickering, but is nowhere near the maximum. It also depends on what kind of frames. For example, if you have 100 bright white frames and a single black frame, you probably won't notice it. If you have 100 black frames and 1 bright white frame, you will definitely notice it.


http://www.100fps.com/how_many_frames_can_humans_see.htm

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

blowfish posted:

Is it even possible to see a difference between say 60 Hz and 144 Hz?
Very much so, yes. Stable 144FPS feels much smoother than 60FPS. Useful in fast FPS games.

Panty Saluter
Jan 17, 2004

Making learning fun!

Lovable Luciferian posted:

In some games going above 60 FPS makes the game feel more responsive.

Frametime is king (shoutout to Pagancow)

I've noticed that too, like if you can maintain 60 but not more the response is still a little sludgy. If you can produce 80-100 even on a 60Hz display it feels a lot better. Maybe it's all power of suggestion but that's how it feels to me.

HMS Boromir
Jul 16, 2011

by Lowtax

fishmech posted:

Honestly the most likely end result is movies moving to 30 fps and 60 fps, just like HDTV. And there's no reason they can't do it right at high framerate.

I mean, my point of comparison is the HDTV my parents got about a year ago, which is the first place I've seen high framerate live action stuff. And boy do I not want to see it ever again.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

I mean, my point of comparison is the HDTV my parents got about a year ago, which is the first place I've seen high framerate live action stuff. And boy do I not want to see it ever again.

Er, the only broadcast standards we're using for HDTV are 30 and 60 FPS, with much of it being 30. These are hardly high framerate, considering those have been standard rates for TV and monitors for ages.

Are you sure you're not thinking of a TV set that upconverts lower frame-rates to 120 FPS by literally generating new frames based on the average of the frames it's actually receiving? That looks pretty fake, but it's because the actual content is being stretched out with frames that don't actually exist in the source. That sort of stuff is always going to look wonky.

HMS Boromir
Jul 16, 2011

by Lowtax
Hm. Might be that. I've never owned a TV so my only data points are my parents' old one from like 1992 and the new one, which is some kind of Samsung 1080p deal. Would I notice a significant, weird-feeling difference between what would have gotten displayed on the old TV over all these years and an HDTV without the generated in-between frames?

Panty Saluter
Jan 17, 2004

Making learning fun!

HMS Boromir posted:

I mean, my point of comparison is the HDTV my parents got about a year ago, which is the first place I've seen high framerate live action stuff. And boy do I not want to see it ever again.

Really? I mean I understand that HFR feels weird in a movie but live 60 FPS feeds are amazing.

Toast Museum
Dec 3, 2005

30% Iron Chef
If I understand correctly, a lot of the "soap opera effect" comes from trying to display 24fps content at 60fps. 24 doesn't divide into 60 evenly, so the TV has to create interpolated frames to pad things out. It should be less of a problem with 120Hz displays, because 120/24 = 5, so for 24fps content it can just show each frame for five refreshes without doing any interpolation.
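To put rough numbers on the pulldown point, here's a quick hypothetical cadence calculation, assuming the display simply holds each source frame for a whole number of refreshes:

```python
def repeat_pattern(source_fps, display_hz, n_frames):
    """How many refreshes each source frame occupies under an ideal cadence."""
    counts, shown = [], 0
    for i in range(1, n_frames + 1):
        # refresh index at which source frame i should be replaced
        end = int(i * display_hz / source_fps + 0.5)
        counts.append(end - shown)
        shown = end
    return counts

print(repeat_pattern(24, 60, 4))   # [3, 2, 3, 2] -- the uneven 3:2 pulldown
print(repeat_pattern(24, 120, 4))  # [5, 5, 5, 5] -- every frame held evenly
```

The alternating 3-refresh/2-refresh hold is where pulldown judder comes from; at 120 Hz the division is exact, so no interpolation (or judder) is needed.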

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

Hm. Might be that. I've never owned a TV so my only data points are my parents' old one from like 1992 and the new one, which is some kind of Samsung 1080p deal. Would I notice a significant, weird-feeling difference between what would have gotten displayed on the old TV over all these years and an HDTV without the generated in-between frames?

Well a regular old SDTV was 30 frames per second, consisting of 60 interlaced fields per second. And your computer monitor since the 80s has rarely been under 60 frames per second (barring odd cases like old laptop LCDs that often only did 30 FPS to save battery/power).

Neither 30 nor 60 fps should look weird to you, and 120 fps also shouldn't look weird so long as there's 120 actual frames and it's not interpolated frames being used to beef up 24/30/60 per second to 120. But pretty much the only true 120 fps content out there is video games and some specialty recordings of like, nature documentaries and sports, so it's not exactly common yet.

Toast Museum posted:

If I understand correctly, a lot of the "soap opera effect" comes from trying to display 24fps content at 60fps. 24 doesn't divide into 60 evenly, so the TV has to create interpolated frames to pad things out. It should be less of a problem with 120Hz displays because 120/24 = 5, so for 24fps content it can just show each frame for five refreshes without doing any interpolation.

You've got it kinda backwards. The soap opera effect is that for a long time high budget TV shows were shot on film and did 3:2 pulldown to convert that back to 60 field per second/30 frame per second analog TV. But since soap operas were on really cheap budgets, they'd just be shot live at first and later to cheap-rear end videotape, which means they never got carefully done lighting, editing, etc. And as part of making the best of a bad situation they'd stick to always over-lighting a scene, because people will put up with a too-bright scene on TV more than a too-dark scene, etc.

Many people got so used to high class shows only being 24 fps (reconverted to standard TV rates, but it's still noticeable) that as more shows in the HD era have moved to just shooting on high quality cameras at full 30 or 60 fps, it "looks like a soap opera" to them. But really it's something you only notice because you got used to the old way.

Although soap operas still look kinda bad today, because even though they're recorded the same way they still have cut-price sets, lighting, and all the rest.

fishmech fucked around with this message at 18:45 on Feb 1, 2016

HMS Boromir
Jul 16, 2011

by Lowtax
Now that I've been thinking about it, if you'll allow me to be gross for a second, I remember back when webm was first catching on there were a bunch of 60FPS porn clips making the rounds and those looked weird to me too. I think I'm just not used to high framerates for live action video.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

fishmech posted:

I mean, 90% of the reason the 48 FPS The Hobbit was so bad was that the movie was poorly shot in general, ya know?

Huh, I thought the first Hobbit looked a little off, but the effect seemed great in the second and third. Maybe that's just me.

EdEddnEddy
Apr 5, 2012



fishmech posted:

Er, the only broadcast standards we're using for HDTV are 30 and 60 FPS, with much of it being 30. These are hardly high framerate, considering those have been standard rates for TV and monitors for ages.

Are you sure you're not thinking of a TV set that upconverts lower frame-rates to 120 FPS by literally generating new frames based on the average of the frames it's actually receiving? That looks pretty fake, but it's because the actual content is being stretched out with frames that don't actually exist in the source. That sort of stuff is always going to look wonky.

That is one of the worst things HDTVs have brought out in years. I believe the only time it is semi-good is when watching sports, but movies are just destroyed by it, and it looks the worst when you watch animation, either hand-drawn Disney-like stuff or 3D Pixar content. Both are impossible to watch, as the blending just does not work in fast movement scenes and such. :barf:

kujeger
Feb 19, 2004

OH YES HA HA
regarding games, how responsive 60/120hz is depends completely on the engine anyway. An engine with say 6 frames of delay between action and reaction is terrible on 60hz but can work okay on 120hz (6 frames of 60hz equals the same amount of milliseconds delay as 3 frames of 120hz). An engine with ~1 frame delay will be very responsive on both 60hz and 120hz.

and then you have terrible engines like fallout 4 that are not very responsive even on 120hz, sigh.

kujeger fucked around with this message at 19:34 on Feb 1, 2016

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.

EdEddnEddy posted:

That is one of the worst things HDTVs have brought out in years. I believe the only time it is semi-good is when watching sports, but movies are just destroyed by it, and it looks the worst when you watch animation, either hand-drawn Disney-like stuff or 3D Pixar content. Both are impossible to watch, as the blending just does not work in fast movement scenes and such. :barf:

I don't think you can blame HDTV for that. Later SD CRT televisions already supported 100/120Hz and they created extra frames by interpolation.

Grapeshot
Oct 21, 2010
The motion smoothing on my TV also seemed to make any 24 fps video displayed using a PC horribly jerky, until I named the input that I was using "PC" in the TV settings. That was the only way to turn it off. I think this is caused by the computer doing a 3:2 pulldown and the TV blithely trying to interpolate over the top of that.

Panty Saluter
Jan 17, 2004

Making learning fun!

kujeger posted:

regarding games, how responsive 60/120hz is depends completely on the engine anyway. An engine with say 6 frames of delay between action and reaction is terrible on 60hz but can work okay on 120hz (6 frames of 60hz equals same amount of miliseonds delay as 3 frames of 120hz). An engine with ~1 frame delay will be very responsive on both 60hz and 120hz.

and then you have terrible engines like fallout 4 that are not very responsive even on 120hz, sigh.

Well that explains why Quake 1 still feels amazing compared to a lot of modern games. John Carmack supremacy

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Speaking of John Carmack, Quake 3 also had its physics affected by FPS; specifically, you could jump further the higher the framerate you could get out of your Voodoo or TNT. Here's a mildly :spergin: explanation:
http://www.psycco.de/125fps/UpsetChaps%20Quake3%20Guide%20-%20Why%20Your%20Framerate%20Affects%20Jumping.htm
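Not id's actual code, but a toy integrator shows the general mechanism: with one physics step per rendered frame, the apex of a jump depends on the timestep, which is part of why certain framerates became "magic" in Quake 3:

```python
def jump_apex(fps, jump_velocity=270.0, gravity=800.0):
    """Highest point reached when gravity is applied once per frame.

    270 units/s jump velocity and 800 gravity are Quake 3's defaults;
    the integration order here (move, then apply gravity) is just one
    plausible choice, not necessarily id's.
    """
    dt = 1.0 / fps
    v, z, apex = jump_velocity, 0.0, 0.0
    while z >= 0.0:
        z += v * dt        # move with the current velocity...
        v -= gravity * dt  # ...then apply this frame's gravity
        apex = max(apex, z)
    return apex

# The continuous-time apex is v^2 / 2g ~ 45.6 units; the discrete apex
# differs from that by a framerate-dependent amount, so jump height
# genuinely changes with FPS.
print(jump_apex(60), jump_apex(125))
```

The real Q3 effect also involves frame times being quantized to whole milliseconds, which the linked guide covers; the takeaway is the same — integrate per-frame and your physics inherits the framerate.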

EdEddnEddy
Apr 5, 2012



Saukkis posted:

I don't think you can blame HDTV for that. Later SD CRT televisions already supported 100/120Hz and they created extra frames by interpolation.

I don't care where it got started or who has it, the option should not be enabled by default, and I want to stab whoever thought it was going to be the next big marketing gimmick in his left testicle. :argh:



*I don't really want to do this, but I do feel bad (only a little) for anyone who gets a new TV, doesn't really like it but lives with the stupid feature because they are too lazy/ignorant/stupid to figure out how to turn it off...

kujeger
Feb 19, 2004

OH YES HA HA

Panty Saluter posted:

Well that explains why Quake 1 still feels amazing compared to a lot of modern games. John Carmack supremacy

just noticed I switched the numbers a little when I typed it out, but yeah. id (and others like epic) are/were pretty good at this.

6 frames @ 60hz = 100ms
6 frames @ 120hz = 50ms

I remember Carmack posting about VR and latency -- 20ms is about the longest it can take from when you move until the screen change hits your eyeballs and still feel good (which includes not only engine processing, but also screen latency, together called "motion-to-photon latency").
The same ballpark probably holds true for responsive games, which means the engine needs to be pretty drat snappy for it to be good, especially with some monitors adding several ms of latency themselves.

CRTs are pretty much 0 latency, while fast LCDs are around <15, fast gaming stuff hitting <5, but non-gaming can be as much as 30+ (here it is usually the signal processing in the lcd that is poo poo, not just slow panels).

but good news! with OLEDs you can get back to almost 0. too bad they cost an arm and a leg for now.
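The arithmetic above, as a quick sketch (the ~20 ms target and the 30 ms worst-case panel latency are just the numbers from this post):

```python
# Back-of-envelope motion-to-photon budget: engine delay in frames
# converted to milliseconds, plus display processing latency.
def frame_ms(frames, hz):
    """Milliseconds of delay represented by `frames` at a given refresh rate."""
    return frames / hz * 1000.0

def motion_to_photon(engine_frames, hz, panel_latency_ms):
    """Engine pipeline delay plus display latency, in milliseconds."""
    return frame_ms(engine_frames, hz) + panel_latency_ms

print(frame_ms(6, 60))    # 6 frames of engine delay at 60 Hz -> 100 ms
print(frame_ms(6, 120))   # the same 6-frame engine at 120 Hz -> 50 ms
# Even a 1-frame engine at 60 Hz plus a slow ~30 ms LCD blows a 20 ms budget:
print(motion_to_photon(1, 60, 30))
```

This is why both the engine's frame-of-delay count and the panel's processing latency matter: either one alone can eat the whole budget.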

Panty Saluter
Jan 17, 2004

Making learning fun!

kujeger posted:

just noticed I switched the numbers a little when I typed it out, but yeah. id (and others like epic) are/were pretty good at this.

6 frames @ 60hz = 100ms
6 frames @ 120hz = 50ms

I remember Carmack posting about VR and latency -- 20ms is about the longest it can take from when you move until the screen change hits your eyeballs and still feel good (which includes not only engine processing, but also screen latency, together called "motion-to-photon latency").
The same ballpark probably holds true for responsive games, which means the engine needs to be pretty drat snappy for it to be good, especially with some monitors adding several ms of latency themselves.

CRTs are pretty much 0 latency, while fast LCDs are around <15, fast gaming stuff hitting <5, but non-gaming can be as much as 30+ (here it is usually the signal processing in the lcd that is poo poo, not just slow panels).

but good news! with OLEDs you can get back to almost 0. too bad they cost an arm and a leg for now.

Yeah, it seems latency has been a concern for Mr. Carmack for some years. I can't fault him for wanting to fix it, high latency is the pits.

An OLED is my next TV for sure. In 3-5 years when they are 1/3 of the price they are now. :v:

TomR
Apr 1, 2003
I both own and operate a pirate ship.
Movies shot on film at 24fps also have their shutters open longer than a TV camera at 60fps. Being open for longer means there is more motion blur. Higher frame rate video is made out of stills with more of the action frozen in each frame. It looks funny and unnatural to see a fast moving object as a series of sharp images rather than a blur.
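A back-of-envelope version of this, assuming the film convention of a 180-degree shutter (shutter open for half of each frame interval):

```python
def exposure_ms(fps, shutter_degrees=180.0):
    """How long the shutter is open per frame, in milliseconds."""
    return (shutter_degrees / 360.0) / fps * 1000.0

def blur_px(speed_px_per_s, fps, shutter_degrees=180.0):
    """How far (in pixels) an object moves while the shutter is open."""
    return speed_px_per_s * exposure_ms(fps, shutter_degrees) / 1000.0

# 24 fps film holds ~21 ms of motion per frame; 60 fps video only ~8 ms.
# An object crossing the screen at 1000 px/s smears ~21 px per film frame
# but only ~8 px per video frame -- sharper, more stroboscopic images.
print(exposure_ms(24), exposure_ms(60))
print(blur_px(1000, 24), blur_px(1000, 60))
```

The px/s speed is just an illustrative number; the point is that higher frame rates shrink per-frame exposure, so fast motion is frozen instead of blurred.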

NLJP
Aug 26, 2004


EdEddnEddy posted:

I don't care where it got started or who has it, the option should not be enabled by default, and I want to stab whoever thought it was going to be the next big marketing gimmick in his left testicle. :argh:



*I don't really want to do this, but I do feel bad (only a little) for anyone who gets a new TV, doesn't really like it but lives with the stupid feature because they are too lazy/ignorant/stupid to figure out how to turn it off...

Yeah this poo poo is terrible. My parents got a new TV for the first time in over a decade not too long ago and we watched some movies on it over the holidays, and I just kept thinking 'this looks like poo poo, way too smooth and weird', and then my brother switched off an obscure option in the main menu (it seems to be called something different depending on the manufacturer) and suddenly it all looked just fine.


Ham Sandwiches
Jul 7, 2000

Post after post of "Hey guys I tried this new thing that was different than my prior experience and it was WEIRD" about motion smoothing / frame interpolation. Like, literally: I went over to my parents' house and they had a new TV and it had this newfangled thing and I DIDN'T LIKE IT, but then we turned it off, phew, crisis averted.

Videogames, sports, anything shot live or typically meant for viewing on a TV, not a cinema, will look great when interpolated.

Movies and anything shot at 24 fps to give it a film look with lots of blur looks different, and probably not for the better. It will look like a cam recording of actors doing a play, instead of a movie. So don't use frame interpolation for movies, or set it to a low setting - I have a 'clear' setting on mine that doesn't make movie playback look weird but removes a lot of judder.

quote:

It looks funny and unnatural to see a fast moving object as a series of sharp images rather than a blur.

No, it only looks weird in movies / 24 fps src material. It looks great to have a fast moving object at 120 fps vs 60 for everything else.

By the way, frame interpolation is especially desirable in LCDs because high framerates reduce LCD motion blur. That's why it's on by default in most TVs.
