Indiana_Krom
Jun 18, 2007
Net Slacker
When you are using gsync, the vsync toggle in the driver turns into an FPS cap. Enable it and gsync will cap your framerate at the monitor's maximum 144 Hz refresh rate, so you won't experience any tearing even at high framerates. Disable it and the GPU will run past 144 FPS if it can, and you will experience tearing (which may be less visible to you at >144 FPS). I say leave the global vsync toggle enabled for most people on gsync monitors, with the possible exception of competitive twitch shooter players. There is little point in going over the monitor's maximum refresh otherwise, and capping brings some potential power/heat/noise savings beyond just eliminating tearing.

Also set your Windows desktop to 120 Hz and you can enjoy a higher desktop refresh while still having the GPU idle down to standard 100-200 MHz 2D clocks.

Indiana_Krom
Jun 18, 2007
Net Slacker

Shrimp or Shrimps posted:

But do you want to swap to vsync for input lag concerns if FPS exceeds refresh rate? Wouldn't capping your FPS at, say, 143 frames be a better alternative?

(I have no idea).
Not really, capping at 143 would produce pretty much identical results (or 1 frame per second slower, actually). On a gsync or freesync display, input lag at 144 Hz is going to be no more than ~7 ms with or without vsync. There is no fixed refresh interval to miss; if you are off by 0.1 ms the monitor will simply wait for it. The stacking input lag from double-buffered vsync just doesn't happen anymore thanks to the nature of the variable refresh technologies.
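
Quick arithmetic behind the ~7 ms figure and the 143 vs. 144 comparison (a throwaway Python sketch, nothing more):

    # The frame interval is 1000 ms divided by the frame rate; on a variable refresh
    # display this is also the worst case the monitor ever has to wait for a frame.
    for fps in (144, 143):
        print(f"{fps} FPS -> {1000 / fps:.2f} ms per frame")
    # 144 FPS -> 6.94 ms per frame
    # 143 FPS -> 6.99 ms per frame, i.e. capping at 143 changes almost nothing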

Indiana_Krom
Jun 18, 2007
Net Slacker

Space Racist posted:

Nothing I’m aware of - as mentioned earlier it’s a clean install from a few days ago so there’s not much cruft yet. All I had was the Nvidia driver package, Precision XOC, and I also ran the Nvidia Inspector to try to force the GPU to downclock with multi monitors. I uninstalled Precision, deleted Inspector after disabling the downclock attempt, and then used DDU to remove and reinstall the Nvidia driver package just in case Inspector left anything behind.
I do also have Gigabyte SIV for setting a custom profile for my case fans, and installed Intel XTU and a couple other monitoring programs (GPU-Z, CoreTemp, CPU-Z) also.
At this point I’d be happy if I could just get it to trigger GPU boost in games. No matter what game, it’s stuck at the base clock of 1569 MHz, VDDC of 0.8120 V and the PerfCap as ‘Idle’.
I’m tempted to just wipe and reinstall Windows again, but I’d like to figure this out instead of just taking the nuclear option.

In the Nvidia control panel, try toggling "debug mode" in the help menu (it forces the card to reference clocks and voltages, even for factory overclocked cards). I know my 1080 stops idling when you connect more than 2 monitors to it: my primary display is 1080p/240 Hz and my secondary is 1200p/60 Hz, and if I connect a third display my card goes to a high idle of around 1100 MHz, not the full 1600 MHz base clock but much higher than the default 139 MHz idle. 2x 2160p/60 monitors do require more bandwidth to drive, but the card should still boost when called for even if it isn't idling anymore.

You said it's an RMA replacement card? Perhaps examine the BIOS on it and make sure it hasn't been tampered with (like someone disabling boost and undervolting the card in order to maintain peak efficiency as a mining card). Though the last time I checked, which was admittedly a LONG time ago, it wasn't possible to flash Pascal cards with a custom BIOS yet...

Indiana_Krom
Jun 18, 2007
Net Slacker

Aeka 2.0 posted:

I just realized my motherboard has been running my 3000mhz ram at 2100 and I just assumed that's what i bought because my brain is poo poo. Why would it do this when set to auto?

Because auto uses the JEDEC standard speed that guarantees compatibility; to actually get what the manufacturer claims, you have to switch it from auto to XMP.

Indiana_Krom
Jun 18, 2007
Net Slacker
I had a factory overclocked EVGA 980 that was unstable and would lock up every few hours of gaming, causing the driver to reset and whatever game I was playing to crash to the desktop. I RMAed it and the replacement was better but still did it at least once every few days. And it wasn't even happening at the highest frequency/power/load: at the time I was using a 60 Hz 1200p display with adaptive vsync, so quite often the card would crash at 70-90% power but would be fine at 100%. However, I completely eliminated the issue by loading up MSI Afterburner and just dragging the power target and voltage target to the maximum values without applying any further overclock (it boosted to 1.4 GHz out of the box anyway). I loaned the card to a friend and told him to do the same thing; it is still working perfectly fine to this day and never crashes on him.

Basically, depending on the silicon lottery, the card may dip into voltages that are too low to be stable at a given boost frequency, and it can be fixed by just globally tuning up the voltage.

Indiana_Krom
Jun 18, 2007
Net Slacker
I recently had a Seasonic die on me, a G series 360W 80+ Gold, at about 4 years of run time (near 24/7; it was powering a pfSense router box that tended to hover at around 40W at the outlet according to the battery backup it is attached to).

Anime Schoolgirl posted:

doesn't the 2080ti draw like 600w at very brief moments

that would do wonders on any PSU that aren't rated at least that wattage

Probably, but it's harmless. All GPUs and CPUs can cause momentary (less than a millisecond) current spikes well into multiples of the rated TDP; this is what the capacitors are for. The important thing is that the average power over a second or two shouldn't exceed the specifications.

Indiana_Krom
Jun 18, 2007
Net Slacker
I went from a 680, to a 980, to a 1080 (and water cooled the 1080), and at the moment I have very little interest in the 2000 series. The only cards fast enough to really be worth the effort to tear everything down and rebuild start at $1200, and that is before throwing a full cover water block on one. I think I'm going to sit it out till whatever they put out on 7 nm hits; hopefully by then we will see more widespread use of RTX hardware and will have some idea about how it actually performs too.

Indiana_Krom
Jun 18, 2007
Net Slacker
Identifying a CPU limit in a game is trivially easy: disable AA modes and crank the resolution down (like take that upscaling setting and move it to negative 50%). If your FPS barely changes at all, you are at a CPU limit. If on the other hand it goes up significantly, then it's a GPU limit and you shouldn't worry about the CPU.

Also, I don't know if it is the DRM or what, but Ubisoft does suck at this: Far Cry 5 on a 4.8 GHz i7-7700k/GTX 1080 is totally CPU limited even at 1080p with a heavy AA mode. The base engine they are using is like 10 years old at this point; you would think they would have ironed out some optimizations by now.

Indiana_Krom
Jun 18, 2007
Net Slacker

TheFluff posted:

Really wasn't planning to spend this Sunday afternoon computer janitoring, but alas, I noticed colors were looking wonky, and sure enough the monitor had got completely stuck in YCbCr 422 mode for no apparent reason whatsoever. Attempting to change back to RGB from the Nvidia control panel was impossible - changing away from "use default color settings" to "use NVIDIA color settings" and hitting apply just immediately went back to default. Doing a clean driver reinstall via Geforce Experience changed jack poo poo. Rebooting multiple times changed jack poo poo. Booting into safe mode did solve it while in safe mode, so it was clearly a driver issue.

According to the internet the generally accepted way of resolving this is to use DDU in safe mode to completely reinstall the drivers, which I did, and it did solve the problem, but what the gently caress? Why? There must be some specific configuration poo poo that can be cleared out in a targeted way rather than reinstalling the entire goddamned thing.

Yeah, I can kind of see why Nvidia is still using an expensive FPGA for gsync HDR; there is no point in taping out an ASIC for it in the absence of enough DisplayPort or HDMI bandwidth to drive HDR 4:4:4 at the 144 Hz target refresh rate. With the current 8b/10b encoding used in HDMI and DP, you'd need 60 Gbps to deliver 4K HDR at 144 Hz. If they ever target 4K HDR at 240 Hz they would need over 100 Gbps on the cable to pull it off.
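
Rough math behind those cable figures (a back-of-the-envelope Python sketch; the 10- and 12-bit-per-channel depths are my assumptions, and blanking intervals are ignored, which is why the totals land a bit under the round numbers above):

    def cable_gbps(width, height, hz, bits_per_channel):
        # Raw video bit rate for RGB/4:4:4, then the 8b/10b line-code overhead on top.
        data_rate = width * height * hz * bits_per_channel * 3  # bits per second
        return data_rate * 10 / 8 / 1e9

    print(cable_gbps(3840, 2160, 144, 10))  # ~44.8 Gbps, 10-bit HDR at 144 Hz
    print(cable_gbps(3840, 2160, 144, 12))  # ~53.7 Gbps, 12-bit HDR at 144 Hz
    print(cable_gbps(3840, 2160, 240, 10))  # ~74.6 Gbps at 240 Hz, before blanking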

Indiana_Krom
Jun 18, 2007
Net Slacker

TheFluff posted:

Even liquid metal is a poor thermal conductor compared to copper - we're talking like 70 W/mK to somewhere in the 3-400 W/mK range. It's great compared to thermal paste though, which tends to be somewhere around 10 W/mK.

More like 5 W/mK; the really good stuff can push 12 W/mK.

I have both my (delidded) CPU and my GPU in the same custom loop and it is not uncommon for the CPU to be 20 C warmer than the GPU at load. The GPU consumes nearly 2x the power of the CPU, but its die is dramatically closer to the coolant than the CPU's could ever be.
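
To put those conductivity numbers in perspective, here's a crude one-dimensional conduction estimate (every value below is an assumption for illustration, not a measurement from my loop):

    # Temperature rise across a layer: dT = P * t / (k * A), with heat flow P in watts,
    # thickness t in meters, conductivity k in W/mK and contact area A in m^2.
    # Assumed: 100 W through a 50 micron layer over 1.5 cm^2 of die area.
    P, t, A = 100.0, 50e-6, 1.5e-4

    for name, k in (("decent paste", 8), ("liquid metal", 70), ("solid copper", 390)):
        print(f"{name:>12}: ~{P * t / (k * A):.2f} C across the layer")
    #  decent paste: ~4.17 C
    #  liquid metal: ~0.48 C
    #  solid copper: ~0.09 C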

Also, to the person wanting a silent PC: perhaps think about a custom loop? A full cover GPU block requires no additional fans and effectively cools the memory/power delivery too, and you can cool the CPU in the same loop with a single superior copper radiator instead of multiple AIOs with their aluminum construction. If you use compression fittings on flexible tubing it is really hard to screw up; if you are competent enough to install an AIO on a previously air cooled GPU, going full custom loop is likely well within your skill level. Obviously the one big downside is cost: a custom loop can easily end up at 2x to 3x the cost of a couple of AIOs.

Indiana_Krom
Jun 18, 2007
Net Slacker

Xerophyte posted:

It's more that if my radiator fans are pulling air to the point where the closed R6/S2 front is a significant airflow limitation then my radiator fans are almost certainly too drat loud for me to want to keep the case open. I considered the O11 Dynamic since it looks pretty good but it's about as constrained as air goes. The O11 Air is looks ok, the H500M ugly, the Conquer a hideous monstrosity that clashes with absolutely every other piece of furniture I own.

I'll probably go with the V3000 if I can get one, it's a reasonably good looking case and if I'm spending a few hundred bucks on it then it's going to be furniture I like the look of, dammit. Acute angles, greebles, highly saturated colors and whites with a color temperature over 3000K will be strictly banned.

Edit: "COUGAR PANZER - Military-Industrial Design" could be the result of an intentional effort by some nefarious power to find the name and description most likely to make me feel instant, visceral revulsion from googling a PC case. It's possible the problem here is me, but I have chosen to blame the world.
If you want a full tower that isn't some design Darth Vader rejected for being too edgy, have a look at the Corsair Obsidian 750D. You could also grab the optional perforated front panel, but otherwise the case is pretty close to a perfect monolith and can still hold a boatload of water cooling hardware.

Indiana_Krom
Jun 18, 2007
Net Slacker

Paul MaudDib posted:

It's kind of sad that sleek contemporary design is on the way out in favor of the blingee LED cases.

Also, it's virtually impossible nowadays to find something without a window. I tried, there isn't much.
Indeed, at least they let you turn off or otherwise disable all the RGB; I built my machine to be "stealthy".

Actually, the side panels on the 750D are interchangeable, so if you order a second/replacement solid panel you can turn one into a windowless case. I was tempted to do that on mine, but I was already something like $1500 into the build and decided against spending any more on something that was purely cosmetic.

Indiana_Krom
Jun 18, 2007
Net Slacker

Zero VGS posted:

I have a roll of black gaffer tape for covering up all the various LED at my bedroom desk and it works great with no residue. I guess I can tape across the entire "Nvidia" on the side of the card, but the LEDs will still be lit up and probably leaking light out the side where I can't mask it. I emailed PNY to see if they have any ideas.

Are the LEDs part of the shroud, or part of the PCB? The LEDs on my MSI card were part of the shroud, which meant I could simply unplug the header that powered them (and eventually I dumped the whole thing for an LED-free full cover water block).

Indiana_Krom
Jun 18, 2007
Net Slacker
Frame limiting makes the game engine wait 7 ms between the start of each frame. So it starts a frame, the CPU does the simulation and feeds the data to the GPU, which renders the frame and then scans it out when it's done. If less than 7 ms has passed since starting the last frame, it waits till 7 ms has passed and then repeats the process. If more than 7 ms has already passed, it starts rendering the next frame immediately. The frame rate will never exceed ~141 FPS, so the GPU will always be able to scan out the next frame to the display as soon as it is done; frames never sit in buffers and latency is minimized. The thing is, Nvidia put quite a bit of effort into their frame pacing, so when you enable double buffered vsync the Nvidia driver will lie to the game about when it is ready to start the next frame and make the game wait till roughly 6.94 ms after the start of the last frame before starting the next one. At that point the only difference is that frame limiting does "wait and then render" while vsync does "render and then wait". The amount of additional latency caused by vsync in that scenario is always going to be trivial and less than 1 refresh interval.
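
A minimal sketch of that "wait and then render" loop in Python (simulate() and render() are stand-ins for whatever the engine actually does; this is just to show where the wait sits):

    import time

    FRAME_TIME = 0.007  # 7 ms between frame starts, matching the example above

    def frame_limited_loop(simulate, render):
        next_start = time.perf_counter()
        while True:
            now = time.perf_counter()
            if now < next_start:              # less than 7 ms since the last frame started:
                time.sleep(next_start - now)  # wait, *then* render
            next_start = time.perf_counter() + FRAME_TIME
            simulate()                        # CPU simulation feeds the GPU...
            render()                          # ...which renders and scans out as soon as it's done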

You will only run into latency trouble if you didn't change that one critically important latency setting in the Nvidia control panel's global 3D settings labeled "Maximum pre-rendered frames". The description even says this is how many frames ahead the driver will let the CPU queue up; this setting is absolutely a big fat high latency buffer any time your CPU is faster than your GPU or display. If it is anything other than 1, you are setting yourself up for significantly increased input latency. You can even test it yourself: disable gsync, set your refresh rate to 60 Hz, force vsync, and then play a game that you could easily break 144 Hz in while adjusting that setting. You can even set vsync to half-refresh adaptive, which will let you push the latency of that buffer up to over 133 ms (4 frames ahead at 33.3 ms/frame). Beware that some games also have their own render-ahead setting, which can and often will be stacked on top of the one the driver does.

Granted, latency flies right out the window when you are using AFR SLI, which requires deeper and therefore higher latency buffers just for the technology to function at all. At a minimum, the latency of AFR SLI is going to be double the latency of a single GPU at the same frame rate, or equal to a single GPU at half the frame rate.

Indiana_Krom fucked around with this message at 15:57 on Dec 15, 2018

Indiana_Krom
Jun 18, 2007
Net Slacker

Craptacular! posted:

I realized this morning that the best feature of 144hz really isn’t 144 FPS, it’s a locked 120 FPS within adaptive sync range and no extraneous bullshit.

240 Hz monitors are even better; pretty much none of that bullshit ever applies, because actually exceeding the refresh limit enough for it to matter on one is borderline impossible.

Indiana_Krom
Jun 18, 2007
Net Slacker

Combat Pretzel posted:

I find it a bit hilarious, that Jensen is chastising "uncompliant" FreeSync monitors for blanking and poo poo like that, when not so long ago, their driver was broken for 3-4 releases, making GSync fail exactly the same way.

I've been using a gsync display while always updating my drivers when a new one hit, and this never happened to me. Granted, this is a sample size of one gsync display and one GPU, so maybe I just dodged the bullet?

Indiana_Krom
Jun 18, 2007
Net Slacker

Sininu posted:

CSGO cap is piece of poo poo, I presume Doto2's is as well.

Uncapped:

Capped using ingame method:

RTSS:


Did it ever occur to you that whatever point in the pipeline you are graphing there might be full of poo poo and not actually even remotely connected to when the frames are physically up on the screen? Basically, unless you are graphing the output with FCAT, this means absolutely nothing.

Indiana_Krom
Jun 18, 2007
Net Slacker
Is running full blown ray traced lighting in a modern game actually particularly more expensive than running full blown ray traced lighting in a 22 year old game? The differences in how hard it is to rasterize each one might be enormous, but the ray traced lighting itself shouldn't be that different between a modern game and a two decade old one, I would think. They couldn't do this 22 years ago because it was the same amount of work then as it is now, and even a supercomputer of the era fell well short of the level of performance necessary. This Quake 2 thing is probably there because it's a well known open source engine that probably wasn't terribly complicated to modify for RTX support. IIRC it has been stated that ray tracing is actually the easier method to implement, because when you use it the computer does all the work of making everything look right.

Basically I'm guessing it isn't that it is easier to do on a 22 year old game; it's just that it took 22 years to get hardware fast enough to pull it off, period.

Indiana_Krom
Jun 18, 2007
Net Slacker
As much as people like to diss 1080p monitors, I have one of those 24" 1080p @ 240 Hz monitors and it's loving awesome, even though I don't play esports games.

Granted, supposedly later this year or early 2020 there should be 1440p/240 Hz monitors on the market; it's just going to take a lot more GPU to break even on FPS compared to 1080p, and hitting 240 Hz at 1080p is already a huge pain in the rear end.

Indiana_Krom
Jun 18, 2007
Net Slacker

tehinternet posted:

Yeah, I feel like with my 1080 Ti pushing 120hZ 1440 Ultrawide I’m good til at least the next generation or the one after that. With an i5-4670k I’m due for a CPU upgrade today over any GPU one.

Please talk me out of buying a i9-9900k please

I already own a 9900k, and it's very nice. Today I was shuffling some games off my SSD to free up space and Steam backup was going super slow (~30 MB/s) because it is single threaded :lol:, so I canceled that and instead fired up 7-Zip and started manually compressing them into "fastest" 7z archives on the backup drive, which pegged all 16 threads and ran at 145 MB/s. My old quad core couldn't even come close to that kind of throughput outside of "store", which doesn't compress the files at all; now 7z is actually significantly faster than my gigabit internet, so it is worth it to keep local backups again.
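
For reference, the "fastest" setting is just 7-Zip's lowest compression level with multithreading on; something like this (the paths are made up, and it assumes the 7z command line tool is on your PATH):

    import subprocess

    # Hypothetical paths; point these at your own Steam library and backup drive.
    game_dir = r"C:\Program Files (x86)\Steam\steamapps\common\SomeGame"
    archive = r"E:\backups\SomeGame.7z"

    # -mx=1 is 7z's "fastest" preset, -mmt=on lets it spread across every thread.
    subprocess.run(["7z", "a", "-mx=1", "-mmt=on", archive, game_dir], check=True)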

Indiana_Krom
Jun 18, 2007
Net Slacker

Craptacular! posted:

I think Asus Strix can even tie the GPU's fans to your case fans so the case will spin up to cool the card down (likely requires an Asus motherboard as well).
Not exactly, they just put a couple of PWM headers on the side of the card that you plug your case fans into (but the case fans also count towards the card's total TDP/power limit).

You can still do it in software with a lot of motherboards by using SpeedFan to rev up your case fans based on both the CPU and GPU temps. I used to do that before I went full custom water and just slaved everything to the coolant temperature.

Indiana_Krom
Jun 18, 2007
Net Slacker
Yeah, the feel of a game at 180+ fps is absolutely a lot smoother and snappier than at 40 fps, even if it's a gsync monitor that does it all without tearing or lag.

Indiana_Krom
Jun 18, 2007
Net Slacker
PC gaming is not dead, and is in no danger of being replaced by consoles or streaming. For one thing, a good single GPU gaming PC can draw and dissipate 400W or more of power, which is well over double what most consoles can get away with. PCs will always be 2-3x faster than consoles purely because of the higher power envelope they allow. Streaming is always going to have a latency penalty that will limit adoption for several popular game types.

Indiana_Krom
Jun 18, 2007
Net Slacker

Gay Retard posted:

Things ARE different now, though. I'd rather play an optimized 4K/60FPS game on a next gen console over a PC that can't reach the same performance. I'm not saying PC gaming will die, but the value proposition will get even worse.

Thing is, on the next gen console said game will be capped at 4K/30 fps, while a same-generation PC will be running the same game with higher resolution textures, details, and AA, also at 4K, and holding somewhere around 90-110 fps. We could get to the point where PCs and consoles have the same basic architecture/OS/etc; the PC will just have 3x more of every resource, clocked significantly higher and with a dramatically higher power limit.

Consoles will take over from PCs when they start shipping with 850 watt power supplies and the cooling capacity to dissipate that much heat.

Oh yeah, one more reason PCs are going to stick around: older games. Your new Nth generation console that can do 4K/60 unfortunately can't run any of the hundreds of older games you would also like to run at 4K/60. But your new PC can, or even better, it can run them at 4K/144+. I can still play DOS/3.5" floppy era games on my modern desktop PC, and almost any game with DirectX support can be hacked or patched up to 4K/high Hz. Older games look better and run better on newer PCs than they ever did on the consoles of their day. The old question of "But can it run Crysis?" remains relevant even today, and the answer on a gaming PC is more often than not "Yes." these days.

Indiana_Krom
Jun 18, 2007
Net Slacker

Edmond Dantes posted:

Not sure if this is the correct thread, so apologies in advance.

I just grabbed an ASUS Dual RTX 2060. It's been a while since I changed out a gfx, anything in particular I should do, software/drivers-wise?

I was thinking about running du uninstaller (is that still a thing? is it even needed it I'm swapping a 970 for a 2060 from the same company?).

I also have EVGA PrecisionX running for OCing the 970 and the overlay, was thinking about uninstalling that and installing the one included with the card (GPU tweak 2), or just reinstalling PrecisionX after I swap the card.

Any input is appreciated. Cheers!

You can do all that, or you can just shut down and swap the card. I've done it several times before and never had an issue beyond occasionally needing a second reboot.

Indiana_Krom
Jun 18, 2007
Net Slacker

Edmond Dantes posted:

Hey, I got a couple general (dumb) GPU questions:

I got a 1080p 60hz monitor, nothing fancy here.

1) VSync: Do I want it (no competitive fps stuff, just regular gaming)? Is it the same as limiting framerate?

2) If it's not, should I limit framerate to 60, or should I uncap to give it more... 'overhead' so when things get busy the game doesn't drop below 60?

3) fullscreen vs exclusive fullscreen vs full screen windowed. Is there any actual performance difference on these? I usually go fullscreen window since I have 2 monitors and I tend to alt tab from time to time

3b) Does having two monitor impact performance (GPU-wise)? Second monitor is usually Discord/chrome/YT, maybe a movie from time to time.

4) What are the usual suspects when it comes for the least [performance hit/actual quality improvement] ratio?

That's about it for now, thanks in advance. (I may have dumber follow-up questions)

1) Adaptive vsync will give you the best compromise between performance and quality, but all types of vsync add latency. It is similar but not exactly the same as capping (vsync is a buffer, capping is a throttle).

2) Cap it if you want, but note you can't bank up higher frame rates for when things get busy. If anything, running uncapped makes your computer run hotter and therefore slower.

3) On Windows 7/8.x/10 this is pretty much irrelevant; do whatever works for you. Exclusive full screen might be <1% faster.
3b) No, but what you are running on the second monitor can and will impact performance, possibly severely.

4) Varies by game, look up tweaking guides for specific games.

Indiana_Krom
Jun 18, 2007
Net Slacker

Phone posted:

I need to verify when I get home, but did some more research and apparently nvidia’s drivers freak out and put the card into maximum overdrive and gently caress with the fan profile if you have a high refresh rate monitor and a 60Hz monitor plugged in at the same time.

VelociBacon posted:

This is old news but yes if you set your high refresh rate monitor to over 120hz on older Nvidia cards while having a 60hz secondary monitor it idles the card quite a bit higher. I had this problem with my 980ti and it was resolved with my 2080ti.

I had a 144 Hz and a 60 Hz display together on a 980 and did not have a single issue with this. Currently I have a 240 Hz and a 60 Hz together on a 1080, also with no such issue. The only time I know for sure you will have an Nvidia card fail to idle down is if you plug three displays into a single card (even if they are all 60 Hz).

Indiana_Krom
Jun 18, 2007
Net Slacker
Possible; my combination was 1080p/144 and 1200p/60, which may be close enough to not cause problems.

Indiana_Krom
Jun 18, 2007
Net Slacker

DrDork posted:

1440p@60 / 3440p@100 / 1440p@60 and I've idled my 1080Ti at ~1500Mhz core since day 1. Though between the zero actual load and it being under a AIO, it's silent regardless.

Actual idle on a 1080 Ti should be around 135 MHz or something in that ballpark, but because you have 3 monitors connected it never actually idles down. If you have an iGPU output available, you could always plug one of the 1440p monitors into it and save yourself a few $ in power every time your computer is on.

Indiana_Krom
Jun 18, 2007
Net Slacker

Statutory Ape posted:

just got around to trying this and no dice :(

I should mention that im running a x1080, x1440, and a 4k monitor (idk if that matters)

thank you for the advice though

Plug whatever monitor doesn't need your main GPU's power into the iGPU if you have one. Nvidia cards do not idle if they have 3 or more displays connected (refresh rate is irrelevant).

Indiana_Krom
Jun 18, 2007
Net Slacker

Anti-Hero posted:

I swapped out my EVGA 2080 XC with an EVGA 2080Ti Black. The 2080Ti refused to display when installed into the PCIE16x slot; it worked fine on the 8x. A motherboard BIOS update fixed that, this was odd and frustrating. After installing the drives I ran the Afterburner OC scanner successfully, but when I try to play any games the PC hard reboots. I've tried this both with the stock GPU settings and the overclock. Seems like a PSU issue, likely overcurrent protection, but I didn't think the supply requirements were much different between a 2080 and a 2080Ti? I've swapped back to the 2080 (with an OC) and things are humming along fine. When swapping the video cards I used DDU, clean installs, etc.

Currently I have a Seasonic 660W Platinum PSU. I didn't think the +12V draw was that much different between a 2080 and 2080Ti that my PSU would cause issues. I'll get a 750W+ unit on order. Does this seem reasonable as a troubleshooting step?

Reasonable. If your PSU has more than two 8-pin PCIe connectors, I would also try moving one cable to a different connector: if the PSU has separate 12V rails, it could trip overcurrent protection when both plugs pull from the same rail.

Indiana_Krom
Jun 18, 2007
Net Slacker

Red_Fred posted:

Yeah it's a Dell U2412M. So just 1920x1200 @ 60Hz.

I just checked and it seems like the EVGA RTX 2070 XC Ultra comes with a HDMI to DVI adapter anyway.

The U2412M has DisplayPort, or at least mine does...

Indiana_Krom
Jun 18, 2007
Net Slacker

LRADIKAL posted:

Please share window location fixes... Dual monitor?
Go into your monitor's settings and make sure HDMI/DisplayPort/etc. "deep sleep" or "power saver" modes are disabled. Windows gets pissed and re-arranges everything when the monitor goes to sleep, because those modes make the graphics card think the display was physically disconnected instead of just off/sleeping.


Red_Fred posted:

Anyway for the 2070 power connections is it fine to use the same cable for both 8 and 6 pin plug? Or should I run a new cable for one of them from my psi?
Technically you aren't supposed to do that, but in reality the majority of Nvidia cards never draw more power than can be delivered by a single 8-pin plug, so go for it. (An 8-pin plug is good for 150W, a 6-pin is good for 75W, and the PCIe slot itself can deliver another 75W. The RTX 2070 spec has a TDP of 175W, which means the slot and an 8-pin alone can handle it with 50W to spare.)
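
Spelled out (the 75W/150W connector ratings are the usual PCIe figures, and the 175W TDP is Nvidia's reference spec for the 2070):

    slot, six_pin, eight_pin = 75, 75, 150   # watts
    tdp = 175                                # RTX 2070 reference board power

    budget = slot + eight_pin                # ignore the 6-pin entirely
    print(budget, budget - tdp)              # 225 W available, 50 W of headroom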

Indiana_Krom
Jun 18, 2007
Net Slacker
I can easily tell whenever my frame rate drops under 80. 100 FPS looks and feels better, but past 144 there are diminishing returns outside of very high motion games. Doom 2016, for instance, looks and plays incredible at its 200 FPS engine cap if you have a display that can keep up; 60 Hz only looks moderately less smooth overall thanks to a really solid motion blur implementation, but the downside is everything is blurred into complete obscurity. You can rapidly turn around in Doom at 200 FPS and still clearly see the imp that was going to attack you from behind, whereas at 60 Hz it's all a blur and it's much more challenging to tell the imp from the rocks behind it.

IMO spatial resolution is a problem that has been largely solved, with 1080p on 24", 1440p on 27" and 4K on larger displays. But temporal resolution remains a huge problem: 60 Hz is choppy and makes everything so blurry the resolution becomes irrelevant. I can read text while scrolling on a 240 Hz display fairly easily; on the exact same display at 60 Hz it is a completely unreadable smudge until it stops scrolling. This is an unavoidable problem with sample and hold displays like LCDs. The extra resolution of 4K is completely wasted most of the time you are actually using the display for anything that moves, because the only way to keep the motion "smooth" is to blur the image to the point that even 720p would get the job done. And even if you don't use motion blur, just being a sample and hold display will blur it to hell at 60 Hz.

One really good way to see the impact of higher refresh rates and strobing is to look at the chase camera comparisons on Blur Busters and their TestUFO site.

Edit: F%&# page snipe...

Indiana_Krom
Jun 18, 2007
Net Slacker

craig588 posted:

You just happened to say it, not calling you out at all, but I like that we've gotten to the point that now 60 FPS isn't good enough. Advancement of technology!

It isn't actually anything new; I've always hated 60 Hz. There is a reason I kept using CRTs till my last one died from the horizontal deflection failing in like 2012: my 19" CRTs did 100 Hz refresh at the 1280x960 resolution I used the most at the time. I was pretty unhappy in the years in between, before 144 Hz + *sync became available in LCDs. Hell, back in the early 2000s the game I was playing competitively looked good enough at 640x480 and I played it there because the particular monitor I was on could refresh at 160 Hz at that resolution, and people wondered how I could react to stuff so fast...

Indiana_Krom
Jun 18, 2007
Net Slacker

Sidesaddle Cavalry posted:

Sounds like we need to get ourselves some better brains :v:

Speaking of the one post from a little bit earlier, whatever happened to that hopeful feature of G/FreeSync and ULMB at the same time? I seem to recall someone doing experiments to prove that variable backlight intensities can be synced to variable refresh rates, but didn't hear anything about commercial adoption of the tech after that....

It wouldn't work, because you don't know when the next frame is going to be delivered, so you can't know how bright to make the current frame. Even if you used a fixed strobe interval, the apparent brightness of the screen would vary with the framerate: the more FPS, the brighter the screen would appear.

Indiana_Krom
Jun 18, 2007
Net Slacker

K8.0 posted:

That's true for Gsync. You could however do it with Freesync, since Freesync determines the timing for the next frame ahead of time. Unfortunately that's probably quite a ways off.
That's impossible; how can you set the timing for a frame that doesn't exist yet? The whole point of the *sync technologies is that frames aren't always delivered at a fixed swap interval of the display, so they made a monitor that can wait till the frame is actually done and then swap it in, even at irregular intervals.

Indiana_Krom
Jun 18, 2007
Net Slacker

Stickman posted:

Could the length of the strobe be varied instead of the brightness? Or would that cause unwanted visual effects?
The length of the strobe basically is the brightness in most cases: the longer it is, the brighter it looks, but also the more blur potential there is.

A 2.4 ms strobe is roughly equivalent to running your display at 416 FPS as far as motion blur is concerned. The way ULMB and other strobe modes like it work is by essentially resetting your eyes after every frame; it's less about controlling what you see and more about controlling what you don't see. Sample and hold LCDs induce blur by nature: even if they transitioned from one frame to the next absolutely instantly (0 ms), they would still be blurry at a 60, 120 or even 240 Hz refresh rate.

You have to think about how you see something move on an LCD and how the process actually works. When you move the mouse across the screen, the cursor isn't really moving; it is just a series of pictures of the cursor in rapid succession, where each new image has it in a different position along a line. The problem with sample and hold types is that as your eye follows the cursor "moving", your eye is moving at a constant speed across the screen while the images of the cursor aren't actually moving at all; the blur is the disconnect between how much your eye moves while the cursor sits stationary during every frame. ULMB works by only very briefly flashing the image and leaving the screen black the rest of the time. Due to persistence of vision you only see the flash, and the flash is so short that your eye barely moves relative to the length of time the image is actually there, so you don't get the blurring effect.
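
A quick way to see those persistence numbers (the 1920 px/s pan speed is an arbitrary assumption, roughly a one-second sweep across a 1080p screen):

    # Perceived smear on a sample-and-hold display is roughly the eye-tracking
    # speed (px/s) multiplied by how long each frame stays lit (s).
    speed = 1920  # px/s, assumed for illustration

    for name, persistence_ms in (("60 Hz hold", 1000 / 60),
                                 ("240 Hz hold", 1000 / 240),
                                 ("2.4 ms strobe", 2.4)):
        print(f"{name:>13}: ~{speed * persistence_ms / 1000:.1f} px of blur")
    #    60 Hz hold: ~32.0 px of blur
    #   240 Hz hold: ~8.0 px of blur
    # 2.4 ms strobe: ~4.6 px of blur (about what a 1000/2.4 ~ 417 Hz hold would give)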

Indiana_Krom
Jun 18, 2007
Net Slacker

TheFluff posted:

If that rumor about taping out Ampere this week is true (and Ampere is compute only AFAIUI, no consumer parts), then I'd be very surprised if we see the corresponding consumer cards any sooner than a year from now. Summer/fall 2020 seems more likely to me.

At the same time, unless 7nm is so much more expensive than 12nm for the same number of transistors that it cancels out the area reduction and corresponding yield increase, it would probably not be a good idea to keep on pumping out these enormous, power-hungry and defect-prone Turing dies.

Indiana_Krom
Jun 18, 2007
Net Slacker
Direct3d 14?
4D XPoint memory?
April 1st?
