|
nothing feels as smooth as windows safe mode on a crt
|
# ? Sep 13, 2020 20:48 |
|
Combat Pretzel posted:I'd really like a decent and affordable not-OLED 4K120. Apparently we're still graced with mediocre AUO panels only in that regard. LG is making a 4k144 IPS panel now, as seen in the 27GN950. Don't think anyone has properly reviewed it yet, though
|
# ? Sep 13, 2020 20:48 |
|
K8.0 posted:The first study I ever saw on it was a NASA study involving flight simulators. They found that when they increased the resolution of the ground surface (which granted, is not exactly the same thing as output resolution, but it does correlate) without increasing framerate, pilots actually performed measurably worse in low-altitude maneuvers. There have been a few other studies since that involve the same concept in other ways. It's pretty easy to understand why - at lower resolutions, you're not locking on to fine detail and attempting to track it in motion. As resolution increases at the same real distance moved, you're seeing greater perceptive spatial aliasing. A blob moving one big pixel to the right is much less of an issue for your brain to comprehend as motion than a highly detailed image jumping 50 small pixels to the right, even if the physical size of the motion is the same. I noticed this in action decades ago, long before I could explain it, playing around with lowering resolutions in old FPSs for kicks and noticing how much smoother framerate-locked animations looked and how much less my brain was bothered by them. idk if it's just resolution (i'd think it's more about detail), but this tracks with how i've experienced the transitions between console generations. PS3 and PS4 games are fatiguing in ways that the PS2 wasn't. (lot of reasons why this isn't a scientific measure, eyes aging for one, but i think it really is a thing)
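The "bigger jumps per frame at higher resolution" point can be sanity-checked with a back-of-the-envelope sketch (the helper name and numbers here are mine, not from any study):

```python
# Same physical motion, same framerate: the per-frame jump in pixels scales
# with horizontal resolution, so each discrete step the eye must integrate
# as "motion" gets larger.
def pixels_per_frame(h_res, fps, screen_crossing_s=1.0):
    """Pixels an object skips per frame if it crosses the full screen
    width in screen_crossing_s seconds."""
    return h_res / screen_crossing_s / fps

print(pixels_per_frame(320, 30))   # 320x200-era: ~10.7 px per step
print(pixels_per_frame(3840, 30))  # 4K at the same framerate: 128.0 px per step
```

Same angular speed on screen, twelve times the pixel displacement per frame - which is the spatial-aliasing argument in miniature.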
|
# ? Sep 13, 2020 20:54 |
|
I kept seeing that model number everywhere. Didn't know it was HFR. Neat! Eventually I want a new display, because one of my XB271HUs has developed weird patterns in the glue after four years.
|
# ? Sep 13, 2020 20:54 |
|
Combat Pretzel posted:I kept seeing that model number everywhere. Didn't know it was HFR. Neat! Eventually I want a new display, because one of my XB271HUs has developed weird patterns in the glue after four years. That happened to my XB271HU too, thankfully within warranty and they swapped the panel out good job auo
|
# ? Sep 13, 2020 20:57 |
|
Rinkles posted:idk if it's just resolution (i'd think it's more about detail), but this tracks with how i've experienced the transitions between console generations. PS3 and PS4 game are fatiguing in ways that the PS2 wasn't. many ps2 games managed to hit pretty high framerates is why. big jrpgs and poo poo probably ran pretty poorly, but with those it's not such a big deal. ironically, on ~modern~ consoles this is no longer the case and more games target 30. at 30fps i can maybe see how increasing resolution would be a problem. at 60-90fps it's whatever tho e: thinking about that please don't project nasa "simulator study" into anything running at normal framerates because simulators never not run at sub-20fps lol Truga fucked around with this message at 21:03 on Sep 13, 2020 |
# ? Sep 13, 2020 20:59 |
|
Truga posted:many ps2 games managed to hit pretty high framerates is why. big jrpgs and poo poo probably ran pretty poorly, but with those it's not such a big deal. i was gonna mention that, but i don't think that's the whole story because i played ps3 era games on pc at good framerates and had similar issues. (nothing like the migraines i got from the ps3 version of skyrim, though) e:though to be clear, even now, there absolutely is a difference between console and pc, as far as my vision issues go Rinkles fucked around with this message at 21:09 on Sep 13, 2020 |
# ? Sep 13, 2020 21:07 |
|
DarthBlingBling posted:Voodoo5500 deffo had a molex socket for power ye I'm sure there were giant pro rendering cards that needed a Molex connector before this.
|
# ? Sep 13, 2020 21:30 |
|
So I have a ryzen 3600 (I don't think I've overclocked it, I put a noctua cooler on it and goons said that the cooler it is, the better it runs (?) ), a geforce 980, 16gb of ram and a tomahawk max. I was on the final boss of sekiro trying to get my 1440p 144hz monitor setup to just display as many FPS as possible, in the hopes that it would make me have more of a chance at defeating him (it didn't). I had to use an FPS unlocker. The most I could get was about 100-120fps with like, everything turned down and resolution at 480p. So, does this mean that with my ryzen 3600, I will never be able to get 144fps on games really? I suppose that'll be less of a big deal when I finally get a video card that can actually do g-sync or freesync.
|
# ? Sep 13, 2020 22:38 |
|
redreader posted:So I have a ryzen 3600 (I don't think I've overclocked it, I put a noctua cooler on it and goons said that the cooler it is, the better it runs (?) ), a geforce 980, 16gb of ram and a tomahawk max. Your CPU is fine - your GPU is showing its age, though. https://www.dsogaming.com/pc-performance-analyses/sekiro-shadows-die-twice-pc-performance-analysis/
|
# ? Sep 13, 2020 22:46 |
|
Truga posted:many ps2 games managed to hit pretty high framerates is why. big jrpgs and poo poo probably ran pretty poorly, but with those it's not such a big deal. Indeed, many PS2 games did target 60FPS. The PS3 and such really struggled with 1080p and either didn't bother rendering internally at that resolution, or targeted 30FPS (or lower), or often times both. While visual quality was considerably higher, actual performance suffered substantially.
|
# ? Sep 13, 2020 22:52 |
|
K8.0 posted:The first study I ever saw on it was a NASA study involving flight simulators. Do you have a link for this? I tried searching but couldn't find anything even vaguely similar to that, and instead found a bunch of NASA flight sim studies from like 1992. I mostly ask because, as someone who has spent considerable time utilizing military flight sims, I'm not immediately sold that not increasing framerate is the issue: one of the nasty parts of some of the older flight sims is that ground terrain detail was just detailed enough that you wanted to use it for visual cuing for things like approach speed, height, etc., but the detail wasn't actually sufficient for that, and/or didn't change correctly or at the correct rate, and so you'd get basically suckered in to using inaccurate measures of spatial positioning. Going back a step or two to sims where the detail was clearly insufficient sometimes made it easier because you could simply discard the visual cuing as obviously inaccurate and pay more attention to your instruments. So yeah, curious what NASA has to say on that.
|
# ? Sep 13, 2020 23:03 |
|
It looks like this summary was the particular thing I was remembering, although the study with the bit I was referencing is newer than I remembered. Parts of it are relevant to the topic and parts aren't, but it does reference a bunch of material and I found some of it useful the last time I dug into the topic. I know there was some stuff I found through some other sources but I can't remember what right now. I definitely remember some of the most interesting stuff being quite old, I still believe the 70s, having some very interesting testing methods since it obviously predated real-time rendering. How human vision works is a really interesting and broad and deep topic and you can get way the gently caress down the rabbit hole if you keep digging into it.
|
# ? Sep 13, 2020 23:47 |
|
Cream-of-Plenty posted:Just put a fuckin AC power jack on the back of the card and plug it into the wall already, you cowards Idiots can't be trusted to not plug a GPU wall wart into an un-UPSed or non-surge protected outlet. People might spend $3000 on a ~bitchin' rig~, but they won't spend $150 on a decent 1000-1500VA UPS or even $50 on a pro-grade surge protector.
|
# ? Sep 13, 2020 23:49 |
|
AirRaid posted:In other news, in relation to the new 12 pin connector and the adapter that ships with it, I was clearing out some stuff today and found this - So I looked it up, and a molex connector is rated for 11A on the 12v pin, so 132W. You can also get 2x6-pin to 1x8-pin connectors. So you can do: code:
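The quoted per-pin rating checks out with basic P = IV arithmetic (helper name is mine):

```python
# Connector power budget: watts = amps per pin * volts * number of 12 V pins.
def connector_watts(amps_per_pin, volts=12.0, pins=1):
    return amps_per_pin * volts * pins

print(connector_watts(11))          # one Molex 12 V pin at 11 A: 132.0 W
print(connector_watts(11, pins=2))  # two Molex feeding an adapter: 264.0 W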
|
# ? Sep 14, 2020 00:11 |
|
BIG HEADLINE posted:Idiots can't be trusted to not plug a GPU wall wart into an un-UPSed or non-surge protected outlet. In my area of the Pacific Northwest I'm not as concerned about surge protection as I was when I lived in lightning-happy Arizona. Around here I would probably use a surge protector on a theoretical card's power adapter, but wouldn't feel terrible plugging into the wall - the DC converter would almost certainly fail before passing overcurrent to the card, from my understanding, similar to laptop power supplies. DrDork posted:So I looked it up, and a molex connector is rated for 11A on the 12v pin, so 132W. You can also get 2x6-pin to 1x8-pin connectors. So you can do:
|
# ? Sep 14, 2020 00:17 |
|
redreader posted:I was on the final boss of sekiro trying to get my 1440p 144hz monitor setup to just display as many FPS as possible, in the hopes that it would make me have more of a chance at defeating him (it didn't). I had to use an FPS unlocker. The most I could get was about 100-120fps with like, everything turned down and resolution at 480p. So, does this mean that with my ryzen 3600, I will never be able to get 144fps on games really? I suppose that'll be less of a big deal when I finally get a video card that can actually do g-sync or freesync. So a couple things: You don’t really need 120+ FPS on everything. You need to be an expert gamer AND playing certain types of games to make much use at that level. Shooters are the most common example of this. For example, I can more easily calculate projectile trajectories on moving targets with 120 FPS than with 60. So I want 100+ FPS for Overwatch, but for Monster Hunter I’m fine with 80 and for some cinematic thing like Assassin’s Creed I’d rather crank settings and target 60. The thing about these *sync monitors is, if they’re not poorly made, they let you operate at different target FPS per game and not suffer clipping or judder for it. If you want to play all the games at 1440/144, even with a new card you’re adjusting settings down or spending over a thousand dollars. Sekiro is made for consoles. While high FPS + *sync is good for timing focused games like that, in that they give you more room for error with decreased lag, the game should be playable at 60 FPS. I don’t know your monitor, but based on what you said it sounds like it works with both syncs, so maybe look for a 1660 Super. Buying anything more expensive than that right now is not advisable.
|
# ? Sep 14, 2020 00:25 |
|
Do we know what time on the 17th the 3080 cards are going on sale?
|
# ? Sep 14, 2020 00:26 |
|
Well it’s official https://twitter.com/markets/status/1305285968845590528?s=20
|
# ? Sep 14, 2020 00:32 |
|
shrike82 posted:Well it’s official What does this mean? Does Arm own the foundries or just design architecture and license it?
|
# ? Sep 14, 2020 00:35 |
|
someone finally made an rgb pcie riser https://twitter.com/Toble_Miner/status/1304803447087259648/photo/2
|
# ? Sep 14, 2020 00:38 |
|
Nvm
|
# ? Sep 14, 2020 00:40 |
|
The answer to “how fast is it worthwhile to go” has pretty much always been “as fast as you can without making significant compromises in image quality”. 144 Hz is self-evidently better, and I suspect once 240 Hz 1440p IPS becomes available the benefits will be noticeable. Pro gamers can probably tell the difference between 240 and 360 Hz, and they will probably be able to tell the difference with 540 Hz after that. It just isn’t worth it given the other trade-offs involved such as TN panels, limited color space, limited viewing angles, high cost, and extreme PC build requirements. But let’s say 540 Hz 1080p OLED monitors become commonplace - I think even a normal person is going to be able to tell a difference between 144 Hz and 540 Hz on a showroom floor with a PC that is capable of driving it. The biggest limitation is always going to be what you’re giving up - resolution, limited screen size, ultrawide, etc. Which is why the developments in “middle of the road” 1440p IPS are so important - 1440p 240 Hz IPS or 1440p/1600p 165 Hz ultrawide IPS appeals to a lot more people than 1080p 240 Hz TN or 1080p 240 Hz IPS did. The improvements are already tapering off though. You really need to go at least 50% faster for most people to notice it. 60 Hz to 100/144 Hz is extremely noticeable, 144 to 240 is less so, 240 to 360 is less so, 360 to 540 will be less so, etc. I think in practical terms 540 Hz is probably about where it'll really stop being worth chasing even for pros. It's really just not worth doing rigs that can push 800+ fps for a tiny marginal improvement. But I mean, if we made some breakthrough in computing that massively increased performance to the point where 1000 fps was achievable, and had superfast gaming OLEDs that could do it? People would keep pushing it. Paul MaudDib fucked around with this message at 01:01 on Sep 14, 2020 |
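The diminishing-returns argument falls straight out of frame times (quick sketch, my numbers):

```python
# Each refresh-rate jump saves fewer milliseconds of frame time than the
# one before it, which is why 60 -> 144 Hz is dramatic and 360 -> 540 isn't.
def frame_time_ms(hz):
    return 1000.0 / hz

for lo, hi in [(60, 144), (144, 240), (240, 360), (360, 540)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} Hz saves {saved:.2f} ms per frame")
# Prints roughly 9.72, 2.78, 1.39, and 0.93 ms respectively.
```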
# ? Sep 14, 2020 00:40 |
|
Howard Phillips posted:What does this mean? Does Arm own the foundries or just design architecture and license it? Getting access to Arm's talent and IP mainly. Nvidia's main constraint is arguably research talent.
|
# ? Sep 14, 2020 00:56 |
|
K8.0 posted:I don't get the obsession with 4k60. 4k60 is not "smooth" or a worthwhile goal for gaming. As resolution goes up, framerate must also increase, or your motion tracking breaks down and things are actually worse than lower resolution. The connection between spatial and temporal resolution has been known since at least the 70s, I'm not sure why it's so hard to get people to accept that it's true but it's pretty easy to experience for yourself. It's the same reason old 3d games were tolerable at 15 FPS - it's not nearly as bad at 320x200 because fewer pixels are being skipped and your brain can deal with it much easier. lol
|
# ? Sep 14, 2020 01:01 |
|
shrike82 posted:Well it’s official is this likely to get approved w/o issue
|
# ? Sep 14, 2020 01:03 |
|
guess the new grad degree to get isn't machine learning but risc-v
|
# ? Sep 14, 2020 01:03 |
|
Paul MaudDib posted:I think even a normal person is going to be able to tell a difference between 144 Hz and 540 Hz on a showroom floor with a PC that is capable of driving it. The average person has the world around them refreshing as fast as they can possibly see. So maybe they can see it’s slightly more lifelike in motion, but is that something they’ve got to have? Not likely. No matter what the consumer-typical frame rate and the performance/esports frame rate are, at the end of the day people would prefer more details and the former than less details and the latter. It’s only going to apply to games either so old or cranked down so low that you could achieve 500 FPS. Any of us could get 500 FPS out of Unreal Tournament or the original Call of Duty but how many people are seriously playing that.
|
# ? Sep 14, 2020 01:03 |
|
ijyt posted:lol I don’t see the problem here. For example as we’ve gone from 480p to 1080p to 4K, the frame rate of movies has also had to increase correspondingly, from 23fps to
|
# ? Sep 14, 2020 01:06 |
|
Isn't 4K60 a short hand for what's been until recently the highest specced 4K TV/monitors? I doubt anyone targeting it is aiming for some perceived physiological optimal as opposed to just maxing out their hardware. We'll see the targets shift when 4K144 becomes readily available. lol, that's like arguing people are dumb for wanting stuff in 1080P and 4K because humans can perceive much higher resolutions
|
# ? Sep 14, 2020 01:08 |
|
Sphyre posted:I don’t see the problem here. For example as we’ve gone from 480p to 1080p to 4K, the frame rate of movies has also had to increase correspondingly, from 23fps to It's 24fps and the hobbit movies tried to do 48fps and it was pretty much universally hated
|
# ? Sep 14, 2020 01:26 |
|
greasyhands posted:Its 24fps and the hobbit movies tried to do 48fps and it was pretty much universally hated It's also goddamn hell for creators. At higher frame rates you've got to make props and practical effects that much more detailed or they look fake. You also just instantly increased the workload of all of your animators and riggers when it comes to CGI and then doubling the rendering time as you just doubled the amount of frames.
|
# ? Sep 14, 2020 01:30 |
|
Sphyre posted:I don’t see the problem here. For example as we’ve gone from 480p to 1080p to 4K, the frame rate of movies has also had to increase correspondingly, from 23fps to movies are different because they have true motion blur to mask the transitions between frames if you're the sort who doesn't like fake motion blur then games do need increasingly high frame rates to brute force smooth motion time for the thread to argue about motion blur again
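The "true motion blur" point is easy to put numbers on (a sketch with hypothetical figures, not anyone's measured data):

```python
# Film with a 180-degree shutter exposes each frame for half the frame
# interval, so a moving point leaves a streak that bridges most of the gap
# to the next frame. A zero-persistence game frame leaves no streak at all,
# which is why games need raw framerate to fake the same smoothness.
def blur_streak_px(speed_px_per_s, fps, shutter_fraction=0.5):
    exposure_s = shutter_fraction / fps
    return speed_px_per_s * exposure_s

print(blur_streak_px(1920, 24))                        # film-style pan: 40.0 px streak
print(blur_streak_px(1920, 24, shutter_fraction=0.0))  # game-style sharp frame: 0.0 px
```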
|
# ? Sep 14, 2020 01:31 |
|
jisforjosh posted:It's also goddamn hell for creators. At higher frame rates you've got to make props and practical effects that much more detailed or they look fake. Why is that?
|
# ? Sep 14, 2020 01:36 |
|
Craptacular! posted:Sekiro is made for consoles. While high FPS + *sync is good for timing focused games like that, in that they give you more room for error with decreased lag, the game should be playable at 60 FPS. https://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-vs-Nvidia-GTX-1660S-Super/2576vs4056 A 1660 super is +16% better for 'effective speed'. I realise it'd have g-sync but I was intending on getting a 3070 for the ray-tracing and 2080ti-like performance, not to mention the extra 4gb of vram. A 2080ti is +170% better according to userbenchmark, so a 3070 would be similar. I have a 'LG 27GL83A-B 27 Inch Ultragear QHD IPS 1ms NVIDIA G-SYNC Compatible Gaming Monitor'. Right now I'm holding off on playing action and fps games because I can run them ok at 1080p but I'd rather play something at full resolution on my new monitor. I can wait a few (4-5) months though, it's no big rush. I don't need to buy a card right now JUST to get g-sync. I have a lot of unplayed non-action games.
|
# ? Sep 14, 2020 01:38 |
|
jisforjosh posted:It's also goddamn hell for creators. At higher frame rates you've got to make props and practical effects that much more detailed or they look fake. You also just instantly increased the workload of all of your animators and riggers when it comes to CGI and then doubling the rendering time as you just doubled the amount of frames. Don’t forget James Cameron’s sin of bringing back stereoscopy.
|
# ? Sep 14, 2020 01:45 |
|
Rinkles posted:Why is that? The lack of motion blur basically. It hides details and helps blur the real elements with the manufactured From a VFX supervisor quote:Set designers, indeed film directors could until now rely on a certain amount of impressionistic leeway. The audience couldn’t see through it. For decades people have been fooled by visual alchemy, quite literally. Metal they accepted as real actually being wood. Stone that’s actually cardboard. Glass that’s actually plastic. Metal that’s actually plastic.
|
# ? Sep 14, 2020 01:45 |
|
greasyhands posted:Its 24fps and the hobbit movies tried to do 48fps and it was pretty much universally hated Rinkles posted:Why is that? Film is archaic as gently caress. Like outside of changes to developing chemicals and lenses and poo poo, film (not shot digitally) is still done the same way it was nearly 100 years ago. In fact, the cameras used for The Hateful Eight were the same cameras used on Ben-Hur (1954). Even digital cameras attempt to just straight copy the style of actual film, even though they can do totally higher spec stuff like 48fps. It's been around so long, and the industry is so used to its quirks and specifics, that changing that stuff changes the entire dynamic of how a movie is made. Liken it to making silent films vs sound films.
|
# ? Sep 14, 2020 01:58 |
|
redreader posted:A 1660 super is +16% better for 'effective speed'. Userbenchmark is bad, unless you’d rather buy an i3 over Threadripper. I wouldn’t use a card with 4GB memory now, even if a strong performer of its generation. I was just trying to solve your problem today, since you might not get a 30-series until next year depending on things we don’t know.
|
# ? Sep 14, 2020 02:05 |
|
Craptacular! posted:Userbenchmark is bad, unless you’d rather buy an i3 over Threadripper. I wouldn’t use a card with 4GB memory now, even if a strong performer of its generation. I was just trying to solve your problem today, since you might not get a 30-series until next year depending on things we don’t know. Right, fair enough! Yeah I'll see how this pans out. If all goes well, I'll manage to get a 3070 from nowinstock.
|
# ? Sep 14, 2020 02:07 |