fozzy fosbourne
Apr 21, 2010

Skuto posted:

This is an oft repeated myth. Frame capping still adds input lag (as compared to rendering more frames when the card is capable of it).

From what I've seen, it's very small, especially if we're talking about capping at 140 Hz. When this came up last time, I referenced a technical discussion from somewhere where it was measured to be a ms or two. It's a fraction of a frame rather than a whole frame, right?


Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

averox posted:

I keep debating on whether I should spring for a FTW or Strix OC because that's what I had my heart set on but keep going back on the fact that this Gigabyte 1080 G1 should be just fine. But, I mean, maaaybee.

Customer support is probably the only real difference between the two, with EVGA being known as an amazing company to deal with. It looks like all the cards perform roughly the same when it comes to overclocks and whatnot.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Atoramos posted:

Cool, signed up for auto-notification on newegg and evga's site, any other recommended ways to snag one when they're available?

If your experience is similar to mine, you're never gonna get a notification from Newegg. They sell out too quickly. Set up alerts on nowinstock.net; that's how I was able to snag my MSI 1080.

Rabid Snake
Aug 6, 2004



For some reason, when I play CS:GO on my GTX 1070, I get high FPS but the game does not run smoothly. It feels like I'm gaming on a 30 Hz monitor. I googled the problem and people were saying it's Shadowplay. I turned off Shadowplay but I still get this problem :(

It sucks because DOOM and Overwatch run perfectly smooth. But for some reason CS:GO makes my monitor feel like a stereoscope. Are there any other things I should check to fix this problem?

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.
[Edit: For some reason I forgot that frame capping can be done with vsync off, making my entire post BS]

Hiowf fucked around with this message at 19:53 on Jun 30, 2016

penus penus penus
Nov 9, 2014

by piss__donald

Skuto posted:

Once you hit max FPS, input lag is necessarily less than a frame (retarded game engine implementations notwithstanding), though with an infinitely fast card and proper triple buffering (aka Fast Sync) you will still approach being an entire frame of input lag worse.

If your card can do 2x max Hz(a more realistic number than infinitely fast), it's half a frame worse, which would be ~3ms on a 144Hz monitor.


Frame capping: when I said "as opposed to rendering more frames," that's exactly fast sync or just no vsync whatsoever. Maybe the question was supposed to be frame capping vs vsync?

It's theoretically possible to approach no input lag even with vsync and drawing no extra frames, when using frame time prediction and delaying the rendering start. Valve has an experimental flag for this in Dota2. But I'm not sure if this can be done externally in the driver like with Fast Sync. Hey, maybe NVIDIA has a feature to sell you the 1180 here...

Anyway, we went way off on a tangent here. The point was that frame capping doesn't fix the input lag issue of vsync.

Yeah I mean turn vsync off and frame cap if you use gsync. Unless you're saying frame capping introduces input lag comparable to vsync?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

THE DOG HOUSE posted:

Yeah I mean turn vsync off and frame cap if you use gsync.

I'm in a bit of a fog today, but what does gsync do with vsync off?

movax
Aug 30, 2008

wicka posted:

i got my 1070 by camping newegg and refreshing the page 100 times a minute, i saw it become available and completed the order and it never came up on nowinstock

Honestly, I would pay Newegg an extra $50 if it meant getting my name on a list for the card to show up in the next few batches.

Klyith
Aug 3, 2007

GBS Pledge Week

fozzy fosbourne posted:

I'm also curious if the threshold for "distinguish and recognizing" is different than "perceiving blur when tracking a moving object" (especially in the context of sample and hold LCDs). It's either that or I'm one of your superhuman outliers :smug: (in which case you can consider me the founder of the brotherhood of evil mutants as of today)

A lot of these tests are based on flashing an image for a very short period of time and seeing if the subject can tell you what it was. The commonly cited case for fastest reactions was fighter pilots -- they could identify airplane silhouettes shown for just 1/200th of a second. That's likely a task that's well suited to the eye -- silhouettes mean outlines, and as per the previous post the visual system allocates more bandwidth to outlines.

Motion, on the other hand, is more distinguishable with your peripheral vision. Back in the CRT days I could easily tell the difference between 60 Hz and 72 Hz refresh by looking at the screen out of the corner of my eye. So I'd say that anyone looking at ultrawide screens for gaming should keep that in mind and plan for above-60 Hz refresh and a good GPU delivering high FPS. Enough of the screen on an ultrawide is in your peripheral vision that inconsistent motion might be more visible.


Subjunctive posted:

I'm in a bit of a fog today, but what does gsync do with vsync off?

Gsync on + vsync off = extra partial frames (and tearing) when GPU FPS is higher than the display's max refresh
Gsync on + vsync on = capped framerate (and a few ms of input lag) when ditto
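Klyith's two-line table, written as a toy decision function. This is just my paraphrase of the behavior described above, not anything from NVIDIA's documentation:

```python
def gsync_result(fps, max_refresh_hz, vsync_on):
    """What the display does with G-Sync enabled, per the summary above."""
    if fps <= max_refresh_hz:
        # inside the gsync range the panel just follows the GPU
        return "variable refresh, no tearing, no extra lag"
    # above the panel's range, gsync can't help and the vsync setting takes over
    if vsync_on:
        return "framerate capped at max refresh, a few ms of extra input lag"
    return "partial frames and tearing"

print(gsync_result(120, 144, vsync_on=False))  # variable refresh, no tearing, no extra lag
```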

Naffer
Oct 26, 2004

Not a good chemist

craig588 posted:

The reference spec for the 4GB version is 7GHz while the 8GB one is specced to 8GHz, but AMD said they will not be strictly enforcing that so AIB vendors will be free to use whatever speed they want.

The XFX, Sapphire, and PowerColor 4GB cards are all listed as having 8GHz memory on Newegg. They're all currently sold out but listed at $199.

penus penus penus
Nov 9, 2014

by piss__donald

Klyith posted:



Gsync on + vsync off = extra partial frames (and tearing) when GPU FPS is higher than the display's max refresh
Gsync on + vsync on = capped framerate (and a few ms of input lag) when ditto

And, in my head, I was thinking Gsync on + frame capping below 144 Hz = capped framerate with less input lag, with no tearing since it's never operating above the gsync range. I only brought this particular scenario up because I figured if you have gsync, why bother with any kind of vsync variation at all? But I don't own one and I'm still trying to work through the earlier posts about why exactly frame capping introduces input lag similar to vsync.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


So as long as the game you're playing caps under 144 FPS, you should just leave VSync off with a GSync monitor?

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Tanreall posted:

You mean the same temps as the GTX 1070/1080 FE cards?

https://www.computerbase.de/2016-06/radeon-rx-480-test/8/

...maybe? I don't speak German, so I can't attest to what the specific charts they've got are (although from what I can tell you seem to be right). I was looking at this: http://techreport.com/review/30328/amd-radeon-rx-480-graphics-card-reviewed/12 along with reports of Pascal running comfortably in the mid-to-upper 60s.

Anyways, this isn't a hill I particularly care to die on. It was just a small throw-away observation about something it didn't seem like anyone else was mentioning.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

THE DOG HOUSE posted:

And, in my head, I was thinking Gsync on + frame capping below 144 Hz = capped framerate with less input lag, with no tearing since it's never operating above the gsync range. I only brought this particular scenario up because I figured if you have gsync, why bother with any kind of vsync variation at all?

I think that's correct, but you'll need some margin below 144Hz.

quote:

But I don't own one and I'm still trying to work through the earlier posts about why exactly frame capping introduces input lag similar to vsync.

It doesn't in your scenario, I had something else (much more silly) in mind.

objects in mirror
Apr 9, 2016

by Shine

Gwaihir posted:

The 480 is a ~$210 970, so an actual 970 for $160 is a real good deal.


Twerk from Home posted:

Sounds like a good deal to me!

Picked it up; the guy was goony and also gave me a nice bag to hold the card in. Unfortunately I couldn't install it last night, as I couldn't find a Phillips screwdriver.

The guy is still selling 2 more 970s for $160, for you goons near Washington, DC.

https://washingtondc.craigslist.org/nva/sop/5651249599.html

Endless Mike
Aug 13, 2003



Having a hell of a time getting a GTX 1070 installed. When I boot my computer, the boot screen shows, as expected, and then it turns black and my TV (Sony 810c 4K) says there's no signal. I tried swapping my 760 back in and it works just fine. Installed the drivers from there and tried again, and no go. I've uninstalled all the drivers and tried. Nothing seems to get this card to want to actually show. The LEDs on the side are slowly changing color and the fans are spinning so I assume it's getting power. Help.

EDIT: Needed to update my BIOS. Rolling my eyes so hard at myself right now.

Endless Mike fucked around with this message at 22:26 on Jun 30, 2016

Klyith
Aug 3, 2007

GBS Pledge Week

exquisite tea posted:

So as long as the game you're playing caps under 144 FPS, you should just leave VSync off with a GSync monitor?

in practice you should try it both ways depending on the game, because games are dumb and sometimes behave better with gsync if vsync is one way or the other
v:v:v

THE DOG HOUSE posted:

But I don't own one and I'm still trying to work through the earlier posts about why exactly frame capping introduces input lag similar to vsync.

Here's nvidia explaining it in a pretty straightforward way:
https://www.youtube.com/watch?v=oTYz4UgzCCU

Basically if the GPU is going really fast the frame that's sitting ready in the buffer has time to get "stale" and the game engine is waiting for the GPU to take delivery of the next frame worth of data. In the degenerate case where the game is always ready and the GPU renders a buffer in zero time, you'd have 1 frame of input lag more than you'd have with vsync off. In that case fast sync would deliver zero lag.

Real world it's gonna be less than that. If the GPU can render 2x your refresh, you'd have 1/2 your refresh in lag. (Rendered and displayed frame -> rendered and discarded frame -> rendered and displayed frame; the input comes from the start of each of those frames.) 3 times, 1/3rd et cetera. It's kinda inefficient if you ask me. If you're really a pro gamer, why wouldn't you just accept the tearing?

Also keep in mind that in all these cases, the input lag we're talking about is only the extra specifically from these buffer waits -- you still have to add your display lag, game engine processing time, etc.
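Klyith's back-of-the-envelope numbers can be written out as a tiny sketch. This is my own paraphrase of the reasoning above (the displayed frame is roughly one render interval old when it's scanned out), not NVIDIA's formula:

```python
def fastsync_extra_lag_ms(refresh_hz, render_fps):
    """Approximate extra buffer-wait lag with fast sync / vsync when the GPU
    renders faster than the display refreshes: the frame shown is about one
    render interval stale by the time the display takes delivery of it."""
    assert render_fps >= refresh_hz, "only applies when the GPU outpaces the panel"
    return 1000.0 / render_fps

# GPU at exactly 2x a 144 Hz refresh -> about half a refresh interval of lag
print(round(fastsync_extra_lag_ms(144, 288), 2))  # 3.47 (ms), i.e. the "~3ms" above
```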

Klyith fucked around with this message at 20:43 on Jun 30, 2016

wicka
Jun 28, 2007


Endless Mike posted:

Having a hell of a time getting a GTX 1070 installed. When I boot my computer, the boot screen shows, as expected, and then it turns black and my TV (Sony 810c 4K) says there's no signal. I tried swapping my 760 back in and it works just fine. Installed the drivers from there and tried again, and no go. I've uninstalled all the drivers and tried. Nothing seems to get this card to want to actually show. The LEDs on the side are slowly changing color and the fans are spinning so I assume it's getting power. Help.

maybe related to this? https://forums.geforce.com/default/topic/946217/gtx-10-series-can-t-boot-correctly-past-330-mhz-dl-dvi-pixel-clock/

Endless Mike
Aug 13, 2003



Oh right I also tried connecting it to my older 1080p TV with the same results. They're both native 120 Hz, so I doubt that's related. I should also add that when I reconnected the 760, Windows updated its 1070 driver.

Also tried various inputs on my TV and receiver, all to the same effect.

Winged Orpheus
May 21, 2010

Domine, Dirige Nos
For those of you still hunting for cards, there's a Chrome extension called Distill that monitors elements of web pages for changes at an interval you can set. I got my 1070 by setting it to look at the "out of stock" and "auto notify" buttons on the Newegg store page for the AIB card I wanted. You'll get a Chrome notification, and you can have it text/email you as well.
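The same idea works as a plain script: poll the page and watch for the sold-out marker to disappear. A minimal sketch — the marker string and URL here are hypothetical, so check the actual page source for whatever button text your retailer uses:

```python
import time
import urllib.request

def in_stock(page_html, sold_out_marker="AUTO NOTIFY"):
    # the page counts as "in stock" once the sold-out marker disappears
    return sold_out_marker.lower() not in page_html.lower()

def watch(url, interval_s=30):
    """Poll until the marker vanishes, then report. Ctrl-C to stop."""
    while True:
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        if in_stock(html):
            print("In stock:", url)
            return
        time.sleep(interval_s)
```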

eggyolk
Nov 8, 2007


It's a bummer that the 480 isn't better. I'd hoped it would compete with the GTX 980 to help push prices down. Now instead of paying $$$ for a replacement of my 970 or a new monitor, I sprung for a Vive headset. Honestly it's about the price of a 1080 or XB271HU and it'll probably feel like a much more significant upgrade.

repiv
Aug 13, 2009

http://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480

PC Perspective posted:

When overclocked, we witnessed motherboard PCIe slot +12V power draw at 95+ watts!

I asked around our friends in the motherboard business for some feedback on this issue - is it something that users should be concerned about or are modern day motherboards built to handle this type of variance? One vendor told me directly that while spikes as high as 95 watts of power draw through the PCIE connection are tolerated without issue, sustained power draw at that kind of level would likely cause damage. The pins and connectors are the most likely failure points - he didn’t seem concerned about the traces on the board as they had enough copper in the power plane to withstand the current.

In short, don't even think about overclocking a reference RX480.

penus penus penus
Nov 9, 2014

by piss__donald
I honestly didn't think that'd actually be a real issue lol. Or at least, I figured it would turn out that it wasn't drawing that much power from the board and reviewers were just jumping the gun.

Captain Yossarian
Feb 24, 2011

All new" Rings of Fire"

THE DOG HOUSE posted:

I honestly didn't think that'd actually be a real issue lol. Or at least, I figured it would turn out that it wasn't drawing that much power from the board and reviewers were just jumping the gun.

That's what my earlier comment was about, but holy yowza lol

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

THE DOG HOUSE posted:

I honestly didn't think that'd actually be a real issue lol. Or at least, I figured it would turn out that it wasn't drawing that much power from the board and reviewers were just jumping the gun.

You might want to look at the 750 Ti review by Tom's: 141 W max and a bunch of peaks over 100 W. On phone or I'd link. So not exactly unprecedented. The overvolt numbers aren't pretty, though.

Hopefully they can fix it though, it's not a good thing.

Craptacular!
Jul 9, 2001

Fuck the DH
The way NVidia's $200 price point has been left in the lurch for so long is frustrating. I'm on the verge of buying used, simply because the reshuffle everyone expected with the 1070/1080 didn't happen; and now it looks like the competition from the other side isn't strong enough to budge them either.

Gonkish
May 19, 2004

If whatever is pulling that extra power (likely the GPU) were to have access to that power OUTSIDE of the PCI bus, it wouldn't need to pull it through the motherboard, right? So, in other words, an 8 pin connector would likely solve this issue outright? If so, going with a 6-pin was a hugely stupid decision on their part, but at least it's one that the AIB partners can rectify.

penus penus penus
Nov 9, 2014

by piss__donald

Gonkish posted:

If whatever is pulling that extra power (likely the GPU) were to have access to that power OUTSIDE of the PCI bus, it wouldn't need to pull it through the motherboard, right? So in other words, an 8 pin connector would likely solve this issue outright? If so, that's a hugely stupid decision on their part, but at least it's one that the AIB partners can rectify.

Oh yes, easy fix... for future cards. The 1070 has the same TDP (...), but uses an 8-pin connector, for example.

I just finished the article and I am legitimately surprised. 27% more amps than the specification allows, at stock speeds, is an even more alarming way to think about it.

I wonder if they'll update the card via software to force it to 75 watts through the motherboard slot and simply overdraw the rest from the PSU connector, since that's practically safe.
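The 27% figure checks out against the PCIe slot limit (5.5 A on the +12 V slot pins, i.e. 66 W); a quick sketch, assuming a stock slot draw of around 84 W, which is what 27% over the spec works out to:

```python
PCIE_SLOT_12V_LIMIT_A = 5.5   # PCIe CEM spec: 5.5 A allowed on the +12 V slot pins (66 W)

def slot_overdraw_pct(slot_watts):
    """How far a given +12 V slot draw exceeds the 5.5 A spec, in percent."""
    amps = slot_watts / 12.0
    return (amps / PCIE_SLOT_12V_LIMIT_A - 1.0) * 100.0

print(round(slot_overdraw_pct(84)))  # 27 -> the stock-speed figure quoted above
print(round(slot_overdraw_pct(95)))  # 44 -> the overclocked spikes PCPer measured
```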

penus penus penus fucked around with this message at 21:36 on Jun 30, 2016

fozzy fosbourne
Apr 21, 2010

Klyith posted:

Gsync on + vsync off = extra partial frames (and tearing) when GPU FPS is higher than the display's max refresh
Gsync on + vsync on = capped framerate (and a few ms of input lag) when ditto

And this is only true as of a driver update sometime in 2015, so you'll see a lot of stuff on the internet referencing the old behavior that g-sync launched with, which was:
If fps >= max_refresh, vsync turned on and the framerate was capped at max refresh, regardless of whether vsync was enabled in the control panel. When that was the behavior, there was a stronger motivation to cap your framerate to something lower than max refresh, because having variable input lag if you were bouncing near fps == max_refresh sounds like it would be kind of poo poo, especially if your max refresh was something like 60 Hz.

edit: replaced "refresh" with "max_refresh" for clarity (where max_refresh is the maximum refresh rate of your panel)

fozzy fosbourne fucked around with this message at 23:00 on Jun 30, 2016

repiv
Aug 13, 2009

xthetenth posted:

You might want to look at the 750 Ti review by Tom's. 141 W max and a bunch of peaks over 100W. On phone or I'd link. So not exactly unprecedented. The overvolt numbers aren't pretty though.

PCPer's contact says momentary spikes over spec are a non-issue; it's continuous over-current that causes damage. The 750 Ti only averages 64 W, so it's fine by their reasoning.

AndrewP
Apr 21, 2010

Boy here's a good deal

Klyith
Aug 3, 2007

GBS Pledge Week

Craptacular! posted:

The way NVidia's $200 price point has been in the lurch for so long is frustrating. I'm on the verge of buying used simply because the reshuffle everyone expected with the 1070/1080 didn't happen; and now it looks like the competition from the other side isn't so strong that it will budge them either.

$200 has been the place that AMD has actually delivered good value for the last 3 years. If you and everyone else refuse to buy AMD cards even when they're the better choice in a price segment, then drat right nvidia is gonna force you to pay $300 for that choice.


THE DOG HOUSE posted:

I wonder if they update the card via software to force it to 75 watts on the motherboard and simply overdraw on the power from the PSU, since that's practically safe.

I doubt it's something you can do with software, but future revisions of the card (8-pin or whatever) will hopefully not make the same mistake.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Klyith posted:

Gsync on + vsync off = extra partial frames (and tearing) when GPU FPS is higher than the display's max refresh

I don't understand why gsync has an effect there. Don't you get the same tearing without gsync if you have vsync off and exceed the refresh rate?

(Do you say "max refresh rate" to distinguish from "currently-configured refresh rate", like running a max-144Hz display at 96Hz?)

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

repiv posted:

PCPers contact says momentary spikes over spec are a non-issue, it's continuous over-current that causes damage. The 750ti only averages 64W so it's fine by their reasoning.

Yeah, that's why I'm a lot less mad about the non-OC numbers. They're barely over. OC is a different kettle of fish.

FormatAmerica
Jun 3, 2005
Grimey Drawer
Holy poo poo the new Pascal cards are huge. Got my 1070 today and it fits but just barely.

Running great, super quiet compared to my 970 which had a bad case of coil whine.

Time to overclock it! :getin:

penus penus penus
Nov 9, 2014

by piss__donald

FormatAmerica posted:

Holy poo poo the new Pascal cards are huge. Got my 1070 today and it fits but just barely.

Running great, super quiet compared to my 970 which had a bad case of coil whine.

Time to overclock it! :getin:

What 970 did you have? So far it looks like they're all about the same size as before, except for the ones specifically made small, but I can see going from one brand to another being startling.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Klyith posted:

I doubt it's something you can do with software, but future revisions of the card (8-pin or whatever) will hopefully not make the same mistake.

Actually, speaking of future revisions, AMD is tacitly acknowledging that Polaris is a bit hosed up, likely due to process and fab (if it's a design issue, then whoa boy is that a lot of money to dump).

So the new scheme is down to R for ≤1080p60, RX for ≥1080p60. First number is generation, second is performance tier (4, 5, and 6 are ~1080p, 7 and 8 are 1440p, and 9 is 4K), 0 represents first revision, a 5 represents second revision. So an RX485 is the guaranteed 1080p60, entry 1440p, second revision, while an R460 is entry 1080p, first revision.

So expect the RX485/RX475 to drop 2017 with less power consumption and higher clocks. This also means no rebranding, 500 series is Navi period.
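EmpyreanFlux's scheme is mechanical enough to decode in a few lines. This is a sketch of my reading of the post — the scheme itself was never official AMD documentation, so treat it purely as illustration:

```python
def decode_amd_name(name):
    """Decode a model name like 'RX485' per the scheme described above."""
    name = name.replace(" ", "")          # tolerate "R 460" style spacing
    prefix = "RX" if name.startswith("RX") else "R"
    gen, tier, rev = name[len(prefix):]   # three digits: generation, tier, revision
    return {
        "segment": "1080p60 and up" if prefix == "RX" else "below 1080p60",
        "generation": int(gen),
        "class": "4K" if int(tier) == 9 else ("1440p" if int(tier) >= 7 else "1080p"),
        "revision": 2 if rev == "5" else 1,
    }

print(decode_amd_name("RX485"))
```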

Klyith
Aug 3, 2007

GBS Pledge Week

Subjunctive posted:

I don't understand why gsync has an effect there. Don't you get the same tearing without gsync if you have vsync off and exceed the refresh rate?

(Do you say "max refresh rate" to distinguish from "currently-configured refresh rate", like running a max-144Hz display at 96Hz?)

gsync is a thing where the monitor refresh is variable and updates in time with the display buffer, under the control of the GPU. so with gsync you don't have a "currently-configured refresh rate".

When gsync is on but the GPU is delivering more frames than the monitor's maximum refresh rate, the monitor can't keep up so gsync is effectively bypassed. In that case, you fall back to the normal vsync on or off options.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

FaustianQ posted:

Actually, speaking of future revisions, AMD is tacitly acknowledging that Polaris is a bit hosed up, likely due to process and fab (if it's a design issue, then whoa boy is that a lot of money to dump).

So the new scheme is down to R for ≤1080p60, RX for ≥1080p60. First number is generation, second is performance tier (4, 5, and 6 are ~1080p, 7 and 8 are 1440p, and 9 is 4K), 0 represents first revision, a 5 represents second revision. So an RX485 is the guaranteed 1080p60, entry 1440p, second revision, while an R 460 is entry 1080p, first revision.

So expect the RX485/RX475 to drop 2017 with less power consumption and higher clocks. This also means no rebranding, 500 series is Navi period.

Any speculation at all about the difference between an R460 and RX460?


EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Twerk from Home posted:

Any speculation at all about the difference between an R460 and RX460?

Everyone is too busy discussing the RX490/495, but here's my interpretation, since an RX450/455 would handle any cut-down version of Polaris 11: an R460 would be an iGPU. Same with R450, 440, etc. That's the only reason I can think of to have an overlap of RX460/450 with R460/450.
