Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
nothing feels as smooth as windows safe mode on a crt

repiv
Aug 13, 2009

Combat Pretzel posted:

I'd really like a decent and affordable not-OLED 4K120. Apparently we're still graced with mediocre AUO panels only in that regard.

LG is making a 4k144 IPS panel now, as seen in the 27GN950

Don't think anyone has properly reviewed it yet though

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

K8.0 posted:

The first study I ever saw on it was a NASA study involving flight simulators. They found that when they increased the resolution of the ground surface (which, granted, is not exactly the same thing as output resolution, but it does correlate) without increasing framerate, pilots actually performed measurably worse in low-altitude maneuvers. There have been a few other studies since that involve the same concept in other ways. It's pretty easy to understand why - at lower resolutions, you're not locking on to fine detail and attempting to track it in motion. As resolution increases at the same real distance moved, you're seeing greater perceptive spatial aliasing. A blob moving one big pixel to the right is much less of an issue for your brain to comprehend as motion than a highly detailed image jumping 50 small pixels to the right, even if the physical size of the motion is the same. I noticed this in action decades ago, long before I could explain it, playing around with lowering resolutions in old FPSs for kicks and noticing how much smoother framerate-locked animations looked and how much less my brain was bothered by them.

I can't be assed to go to the effort of doing it myself, but it would be cool to have a tool like that old goon side by side FPS comparison tool, but instead letting you compare different resolutions at the same framerate.

idk if it's just resolution (i'd think it's more about detail), but this tracks with how i've experienced the transitions between console generations. PS3 and PS4 games are fatiguing in ways that the PS2 wasn't. (lots of reasons why this isn't a scientific measure, eyes aging for one, but i think it really is a thing)
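
K8.0's pixel-jump point is easy to put rough numbers on. A minimal sketch (illustrative figures only, assuming an object that crosses a quarter of the screen width every second): the per-frame jump in pixels scales with horizontal resolution and shrinks with framerate, even though the physical motion is identical.

code:
# Per-frame pixel displacement for the same physical motion at different
# resolutions and framerates. Figures are illustrative, not from any study.

def pixels_per_frame(h_res, fps, screen_fraction_per_sec=0.25):
    """Pixels an object jumps between frames if it crosses
    screen_fraction_per_sec of the screen width every second."""
    return h_res * screen_fraction_per_sec / fps

for h_res in (320, 1280, 1920, 3840):
    for fps in (15, 30, 60, 144):
        jump = pixels_per_frame(h_res, fps)
        print(f"{h_res:>4} px wide @ {fps:>3} fps -> {jump:6.1f} px per frame")
At 320x200 and 15 fps that's about a 5 px jump per frame; at 4K and the same 15 fps it's a 64 px jump, which is the "highly detailed image jumping 50 small pixels" situation described above.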

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I kept seeing that model number everywhere. Didn't know it was HFR. Neat! Eventually I want a new display, because one of my XB271HUs has developed weird patterns in the glue after four years.

repiv
Aug 13, 2009

Combat Pretzel posted:

I kept seeing that model number everywhere. Didn't know it was HFR. Neat! Eventually I want a new display, because one of my XB271HUs has developed weird patterns in the glue after four years.

That happened to my XB271HU too, thankfully within warranty and they swapped the panel out

good job auo

Truga
May 4, 2014
Lipstick Apathy

Rinkles posted:

idk if it's just resolution (i'd think it's more about detail), but this tracks with how i've experienced the transitions between console generations. PS3 and PS4 games are fatiguing in ways that the PS2 wasn't.

many ps2 games managed to hit pretty high framerates is why. big jrpgs and poo poo probably ran pretty poorly, but with those it's not such a big deal.

ironically, on ~modern~ consoles this is no longer the case and more games target 30. at 30fps i can maybe see how increasing resolution would be a problem.

at 60-90fps it's whatever tho

e: thinking about that please don't project nasa "simulator study" into anything running at normal framerates because simulators never not run at sub-20fps lol

Truga fucked around with this message at 21:03 on Sep 13, 2020

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Truga posted:

many ps2 games managed to hit pretty high framerates is why. big jrpgs and poo poo probably ran pretty poorly, but with those it's not such a big deal.

ironically, on ~modern~ consoles this is no longer the case and more games target 30. at 30fps i can maybe see how increasing resolution would be a problem.

at 60-90fps it's whatever tho

i was gonna mention that, but i don't think that's the whole story because i played ps3 era games on pc at good framerates and had similar issues. (nothing like the migraines i got from the ps3 version of skyrim, though)

e: though to be clear, even now, there absolutely is a difference between console and pc, as far as my vision issues go

Rinkles fucked around with this message at 21:09 on Sep 13, 2020

FuturePastNow
May 19, 2014


DarthBlingBling posted:

Voodoo5500 deffo had a molex socket for power

ye



I'm sure there were giant pro rendering cards that needed a Molex connector before this.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
So I have a ryzen 3600 (I don't think I've overclocked it, I put a noctua cooler on it and goons said that the cooler it is, the better it runs (?) ), a geforce 980, 16gb of ram and a tomahawk max.

I was on the final boss of sekiro trying to get my 1440p 144hz monitor setup to just display as many FPS as possible, in the hopes that it would make me have more of a chance at defeating him (it didn't). I had to use an FPS unlocker. The most I could get was about 100-120fps with like, everything turned down and resolution at 480p. So, does this mean that with my ryzen 3600, I will never be able to get 144fps on games really? I suppose that'll be less of a big deal when I finally get a video card that can actually do g-sync or freesync.

CaptainSarcastic
Jul 6, 2013



redreader posted:

So I have a ryzen 3600 (I don't think I've overclocked it, I put a noctua cooler on it and goons said that the cooler it is, the better it runs (?) ), a geforce 980, 16gb of ram and a tomahawk max.

I was on the final boss of sekiro trying to get my 1440p 144hz monitor setup to just display as many FPS as possible, in the hopes that it would make me have more of a chance at defeating him (it didn't). I had to use an FPS unlocker. The most I could get was about 100-120fps with like, everything turned down and resolution at 480p. So, does this mean that with my ryzen 3600, I will never be able to get 144fps on games really? I suppose that'll be less of a big deal when I finally get a video card that can actually do g-sync or freesync.

Your CPU is fine - your GPU is showing its age, though. https://www.dsogaming.com/pc-performance-analyses/sekiro-shadows-die-twice-pc-performance-analysis/

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Truga posted:

many ps2 games managed to hit pretty high framerates is why. big jrpgs and poo poo probably ran pretty poorly, but with those it's not such a big deal.

Indeed, many PS2 games did target 60FPS. The PS3 and such really struggled with 1080p and either didn't bother rendering internally at that resolution, or targeted 30FPS (or lower), or often times both. While visual quality was considerably higher, actual performance suffered substantially.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

K8.0 posted:

The first study I ever saw on it was a NASA study involving flight simulators.

Do you have a link for this? I tried searching but couldn't find anything even vaguely similar to that, and instead found a bunch of NASA flight sim studies from like 1992.

I mostly ask because, as someone who has spent considerable time utilizing military flight sims, I'm not immediately sold that not increasing framerate is the issue: one of the nasty parts of some of the older flight sims is that the ground terrain was just detailed enough that you wanted to use it for visual cuing for things like approach speed, height, etc., but the detail wasn't actually sufficient for that, and/or didn't change correctly or at the correct rate, and so you'd get basically suckered into using inaccurate measures of spatial positioning. Going back a step or two to sims where the detail was clearly insufficient sometimes made it easier because you could simply discard the visual cuing as obviously inaccurate and pay more attention to your instruments.

So yeah, curious what NASA has to say on that.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It looks like this summary was the particular thing I was remembering, although the study with the bit I was referencing is newer than I remembered. Parts of it are relevant to the topic and parts aren't, but it does reference a bunch of material and I found some of it useful the last time I dug into the topic. I know there was some stuff I found through some other sources but I can't remember what right now. I definitely remember some of the most interesting stuff being quite old, I still believe the 70s, having some very interesting testing methods since it obviously predated real-time rendering. How human vision works is a really interesting and broad and deep topic and you can get way the gently caress down the rabbit hole if you keep digging into it.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Cream-of-Plenty posted:

Just put a fuckin AC power jack on the back of the card and plug it into the wall already, you cowards

Idiots can't be trusted to not plug a GPU wall wart into an un-UPSed or non-surge protected outlet.

People might spend $3000 on a ~bitchin' rig~, but they won't spend $150 on a decent 1000-1500VA UPS or even $50 on a pro-grade surge protector.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

AirRaid posted:

In other news, in relation to the new 12 pin connector and the adapter that ships with it, I was clearing out some stuff today and found this -



The more things change...

So I looked it up, and a molex connector is rated for 11A on the 12v pin, so 132W. You can also get 2x6-pin to 1x8-pin connectors. So you can do:

code:
Molex___
        \____ 6-pin ___
Molex___/              \
                        \________ 8 pin____
Molex___                /                  \
        \____ 6-pin ___/                    \
Molex___/                                    \
                                              \____ 12 pin ___ 3090
Molex___                                      /
        \____ 6-pin ___                      /
Molex___/              \                    /
                        \________ 8 pin____/
Molex___                /
        \____ 6-pin ___/
Molex___/                                                                         
And it would (in theory) be able to push up to 1,056W and melt everything in one glorious pile of mistakes and regret! I don't think molex connectors do any sort of power limiting/sensing whatsoever.
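
To sanity-check that 1,056W figure: it's just the nominal per-pin rating multiplied out, with nothing else applied. A quick sketch, assuming the 11A / 12V Molex rating quoted above:

code:
# Theoretical ceiling of the 8x Molex -> 4x 6-pin -> 2x 8-pin -> 12-pin chain above.
# Assumes the nominal 11 A per Molex 12 V pin; ignores derating, wire gauge, and sanity.

MOLEX_12V_AMPS = 11
VOLTS = 12

molex_count = 8                            # two Molex per 6-pin, two 6-pins per 8-pin, two 8-pins
watts_per_molex = MOLEX_12V_AMPS * VOLTS   # 132 W per connector
total_watts = molex_count * watts_per_molex

print(f"{watts_per_molex} W per Molex x {molex_count} = {total_watts} W")   # 1056 W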

CaptainSarcastic
Jul 6, 2013



BIG HEADLINE posted:

Idiots can't be trusted to not plug a GPU wall wart into an un-UPSed or non-surge protected outlet.

People might spend $3000 on a ~bitchin' rig~, but they won't spend $150 on a decent 1000-1500VA UPS or even $50 on a pro-grade surge protector.

In my area of the Pacific Northwest I'm not as concerned about surge protection as I was when I lived in lightning-happy Arizona. Around here I would probably use a surge protector on a theoretical card's power adapter, but wouldn't feel terrible plugging into the wall - the DC converter would almost certainly fail before passing overcurrent to the card, from my understanding, similar to laptop power supplies.


Craptacular!
Jul 9, 2001

Fuck the DH

redreader posted:

I was on the final boss of sekiro trying to get my 1440p 144hz monitor setup to just display as many FPS as possible, in the hopes that it would make me have more of a chance at defeating him (it didn't). I had to use an FPS unlocker. The most I could get was about 100-120fps with like, everything turned down and resolution at 480p. So, does this mean that with my ryzen 3600, I will never be able to get 144fps on games really? I suppose that'll be less of a big deal when I finally get a video card that can actually do g-sync or freesync.

So a couple things:
You don’t really need 120+ FPS on everything. You need to be an expert gamer AND playing certain types of games to make much use of that level. Shooters are the most common example of this. For example, I can more easily calculate projectile trajectories on moving targets with 120 FPS than with 60. So I want 100+ FPS for Overwatch, but for Monster Hunter I’m fine with 80, and for some cinematic thing like Assassin’s Creed I’d rather crank settings and target 60.

The thing about these *sync monitors is, if they’re not poorly made, they let you operate at different target FPS per game and not suffer tearing or judder for it. If you want to play all the games at 1440/144, even with a new card you’re adjusting settings down or spending over a thousand dollars.

Sekiro is made for consoles. While high FPS + *sync is good for timing focused games like that, in that they give you more room for error with decreased lag, the game should be playable at 60 FPS.

I don’t know your monitor, but based on what you said it sounds like it works with both sync so maybe look for a 1660 Super. Buying anything more expensive than that right now is not advisable.

Samadhi
May 13, 2001

Do we know what time on the 17th the 3080 cards are going on sale?

shrike82
Jun 11, 2005

Well it’s official

https://twitter.com/markets/status/1305285968845590528?s=20

Howard Phillips
May 4, 2008

His smile; it shines in the darkest of depths. There is hope yet.

What does this mean? Does Arm own the foundries or just design architecture and license it?

repiv
Aug 13, 2009

someone finally made an rgb pcie riser

https://twitter.com/Toble_Miner/status/1304803447087259648/photo/2

SlayVus
Jul 10, 2009
Grimey Drawer
Nvm

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The answer to “how fast is it worthwhile to go” has pretty much always been “as fast as you can without making significant compromises in image quality”. 144 Hz is self-evidently better, and I suspect once 240 Hz 1440p IPS becomes available the benefits will be noticeable.

Pro gamers can probably tell the difference between 240 and 360 Hz, and they will probably be able to tell the difference between 360 and 540 Hz after that.

It just isn’t worth it given the other trade-offs involved such as TN panels, limited color space, limited viewing angles, high cost, and extreme PC build requirements. But let’s say 540 Hz 1080p OLED monitors become commonplace - I think even a normal person is going to be able to tell a difference between 144 Hz and 540 Hz on a showroom floor with a PC that is capable of driving it.

The biggest limitation is always going to be what you’re giving up - resolution, limited screen size, ultrawide, etc. Which is why the developments in “middle of the road” 1440p IPS are so important - 1440p 240 Hz IPS or 1440p/1600p 165 Hz ultrawide IPS appeals to a lot more people than 1080p 240 Hz TN or 1080p 240 Hz IPS did.

The improvements are already tapering off though. You really need to go at least 50% faster for most people to notice it. 60 Hz to 100/144 Hz is extremely noticeable, 144 to 240 is less so, 240 to 360 is less so, 360 to 540 will be less so, etc. I think in practical terms 540 Hz is probably about where it'll really stop being worth chasing even for pros. It's really just not worth building rigs that can push 800+ fps for a tiny marginal improvement. But I mean, if we made some breakthrough in computing that massively increased performance to the point where 1000 fps was achievable, and had superfast gaming OLEDs that could do it? People would keep pushing it.

Paul MaudDib fucked around with this message at 01:01 on Sep 14, 2020
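
The tapering-off is easier to see in frame times than in refresh rates, since each step up buys a smaller absolute reduction. A quick sketch of the arithmetic, nothing fancier than 1000/Hz:

code:
# Frame-time savings between common refresh rates: the absolute gain shrinks
# quickly even when the Hz jump looks big on paper.

rates = [60, 100, 144, 240, 360, 540]
frame_times = [1000 / hz for hz in rates]   # milliseconds per frame

for i in range(len(rates) - 1):
    hz_a, hz_b = rates[i], rates[i + 1]
    t_a, t_b = frame_times[i], frame_times[i + 1]
    print(f"{hz_a:>3} Hz -> {hz_b:>3} Hz: {t_a:5.2f} ms -> {t_b:5.2f} ms "
          f"(saves {t_a - t_b:.2f} ms per frame)")
The step from 60 to 100 Hz saves about 6.7 ms per frame; the step from 360 to 540 Hz saves less than 1 ms, which is why each successive jump is harder to notice.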

shrike82
Jun 11, 2005

Howard Phillips posted:

What does this mean? Does Arm own the foundries or just design architecture and license it?

Getting access to Arm's talent and IP mainly. Nvidia's main constraint is arguably research talent.

ijyt
Apr 10, 2012

K8.0 posted:

I don't get the obsession with 4k60. 4k60 is not "smooth" or a worthwhile goal for gaming. As resolution goes up, framerate must also increase, or your motion tracking breaks down and things are actually worse than lower resolution. The connection between spatial and temporal resolution has been known since at least the 70s, I'm not sure why it's so hard to get people to accept that it's true but it's pretty easy to experience for yourself. It's the same reason old 3d games were tolerable at 15 FPS - it's not nearly as bad at 320x200 because fewer pixels are being skipped and your brain can deal with it much easier.

lol

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

is this likely to get approved w/o issue

ufarn
May 30, 2009
guess the new grad degree to get isn't machine learning but risc-v

Craptacular!
Jul 9, 2001

Fuck the DH

Paul MaudDib posted:

I think even a normal person is going to be able to tell a difference between 144 Hz and 540 Hz on a showroom floor with a PC that is capable of driving it.

The average person has the world around them refreshing as fast as they can possibly see. So maybe they can see it’s slightly more lifelike in motion, but is that something they’ve got to have? Not likely. No matter what the consumer-typical frame rate and the performance/esports frame rate end up being, at the end of the day people would prefer more details at the former over less details at the latter. It’s only going to apply to games either so old or cranked down so low that you could achieve 500 FPS. Any of us could get 500 FPS out of Unreal Tournament or the original Call of Duty, but how many people are seriously playing those?

Sphyre
Jun 14, 2001


I don’t see the problem here. For example as we’ve gone from 480p to 1080p to 4K, the frame rate of movies has also had to increase correspondingly, from 23fps to

shrike82
Jun 11, 2005

Isn't 4K60 shorthand for what's been until recently the highest-specced 4K TVs/monitors?
I doubt anyone targeting it is aiming for some perceived physiological optimum as opposed to just maxing out their hardware.

We'll see the targets shift when 4K144 becomes readily available.

lol, that's like arguing people are dumb for wanting stuff in 1080P and 4K because humans can perceive much higher resolutions

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered

Sphyre posted:

I don’t see the problem here. For example as we’ve gone from 480p to 1080p to 4K, the frame rate of movies has also had to increase correspondingly, from 23fps to

Its 24fps and the hobbit movies tried to do 48fps and it was pretty much universally hated

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

greasyhands posted:

Its 24fps and the hobbit movies tried to do 48fps and it was pretty much universally hated

It's also goddamn hell for creators. At higher frame rates you've got to make props and practical effects that much more detailed or they look fake. You also just instantly increased the workload of all of your animators and riggers when it comes to CGI, and doubled the rendering time since you just doubled the number of frames.

repiv
Aug 13, 2009

Sphyre posted:

I don’t see the problem here. For example as we’ve gone from 480p to 1080p to 4K, the frame rate of movies has also had to increase correspondingly, from 23fps to

movies are different because they have true motion blur to mask the transitions between frames

if you're the sort who doesn't like fake motion blur then games do need increasingly high frame rates to brute force smooth motion

time for the thread to argue about motion blur again :can:
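
To put rough numbers on the blur-versus-brute-force distinction (illustrative figures only, assuming an object moving 960 px/s on a 1920-px-wide screen and a standard 180° film shutter): film smears each frame across half the frame interval, while an unblurred game frame is a single instant, so the only lever left is more frames.

code:
# Film vs game presentation of the same motion: blur streak vs hard step.
# Numbers are illustrative: an object moving 960 px/s across the screen.

SPEED_PX_PER_S = 960

def film_frame(fps, shutter_degrees=180.0):
    """Per-frame step and motion-blur streak length for film-style capture."""
    step = SPEED_PX_PER_S / fps
    exposure = (shutter_degrees / 360.0) / fps   # seconds the shutter stays open
    blur = SPEED_PX_PER_S * exposure             # streak length smearing that step
    return step, blur

def game_frame(fps):
    """Per-frame step for a game rendering discrete instants (no motion blur)."""
    return SPEED_PX_PER_S / fps

step, blur = film_frame(24)
print(f"film  24 fps: {step:.0f} px step, ~{blur:.0f} px of real motion blur masking it")
for fps in (30, 60, 144, 240):
    print(f"game {fps:>3} fps: {game_frame(fps):.1f} px hard step, no blur")
At 24 fps the 40 px jump arrives with about 20 px of genuine smear softening it; an unblurred game only gets its hard steps down to that size at around 48 fps, and keeps needing more frames from there.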

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

jisforjosh posted:

It's also goddamn hell for creators. At higher frame rates you've got to make props and practical effects that much more detailed or they look fake.

Why is that?

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum

Craptacular! posted:

Sekiro is made for consoles. While high FPS + *sync is good for timing focused games like that, in that they give you more room for error with decreased lag, the game should be playable at 60 FPS.

I don’t know your monitor, but based on what you said it sounds like it works with both sync so maybe look for a 1660 Super. Buying anything more expensive than that right now is not advisable.

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-vs-Nvidia-GTX-1660S-Super/2576vs4056

A 1660 super is +16% better for 'effective speed'. I realise it'd have g-sync but I was intending on getting a 3070 for the ray-tracing and 2080ti-like performance, not to mention the extra 4gb of vram. A 2080ti is +170% better according to userbenchmark, so a 3070 would be similar.

I have a 'LG 27GL83A-B 27 Inch Ultragear QHD IPS 1ms NVIDIA G-SYNC Compatible Gaming Monitor'. Right now I'm holding off on playing action and fps games because I can run them ok at 1080p but I'd rather play something at full resolution on my new monitor. I can wait a few (4-5) months though, it's no big rush. I don't need to buy a card right now JUST to get g-sync. I have a lot of unplayed non-action games.

Shaocaholica
Oct 29, 2002

Fig. 5E

jisforjosh posted:

It's also goddamn hell for creators. At higher frame rates you've got to make props and practical effects that much more detailed or they look fake. You also just instantly increased the workload of all of your animators and riggers when it comes to CGI, and doubled the rendering time since you just doubled the number of frames.

Don’t forget James Cameron’s sin of bringing back stereoscopy.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

Rinkles posted:

Why is that?

The lack of motion blur, basically. It hides details and helps blur the real elements with the manufactured.


From a VFX supervisor


quote:

Set designers, indeed film directors could until now rely on a certain amount of impressionistic leeway. The audience couldn’t see through it. For decades people have been fooled by visual alchemy, quite literally.  Metal they accepted as real actually being wood.  Stone that’s actually cardboard.  Glass that’s actually plastic.  Metal that’s actually plastic.

I’ll never forget the first time I walked directly from a screening room, viewing film dailies from a previous day’s miniature shoot to compare the amazing images I’d just seen on film with what was actually photographed.

The difference was astounding, in this case, a crane shot booming up through a burnt-out cathedral to see the sun flaring out the lens, peeking through a smashed stained-glass window.  If you’re familiar with Interview With the Vampire you might remember the scene.  In actuality the cathedral was maybe 5′ in height, made mostly of painted cardboard and the sun was a lightbulb.  You can forget about that kind of magic with a GH2 much less the 4K future.

The make-up effects industry is already on the down slope.  They’re even less prepared to deal with 4K than the set craftspeople or model makers.  Contemporary make-up effects cannot even stand up to conventional HD photography very well because digital renders rubber as rubber, paint as paint.  It doesn’t look like alien skin.  It doesn’t look like human flesh.  It doesn’t look like anything but what it actually is.


Pilfered Pallbearers
Aug 2, 2007

greasyhands posted:

Its 24fps and the hobbit movies tried to do 48fps and it was pretty much universally hated

Rinkles posted:

Why is that?

Film is archaic as gently caress. Like outside of changes to developing chemicals and lenses and poo poo, film (not shot digitally) is still done the same way it was nearly 100 years ago.

In fact, the cameras used for The Hateful Eight were the same cameras used on Ben-Hur (1959).

Even digital cameras attempt to just straight copy the style of actual film, even though they can do totally higher spec stuff like 48fps. It's been around so long, and the industry is so used to its quirks and specifics, that changing that stuff changes the entire dynamic of how a movie is made.

Liken it to making silent films vs audio films.

Craptacular!
Jul 9, 2001

Fuck the DH

redreader posted:

A 1660 super is +16% better for 'effective speed'.

Userbenchmark is bad, unless you’d rather buy an i3 over Threadripper. I wouldn’t use a card with 4GB memory now, even if it was a strong performer in its generation. I was just trying to solve your problem today, since you might not get a 30-series until next year depending on things we don’t know.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum

Craptacular! posted:

Userbenchmark is bad, unless you’d rather buy an i3 over Threadripper. I wouldn’t use a card with 4GB memory now, even if it was a strong performer in its generation. I was just trying to solve your problem today, since you might not get a 30-series until next year depending on things we don’t know.

Right, fair enough! Yeah I'll see how this pans out. If all goes well, I'll manage to get a 3070 from nowinstock.
