Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

sout posted:

Is there a significant reason to get a 1070 over a 980 Ti?

They cost about the same new right now; the 1070 is a bit faster in most tests, uses less power, and has more features. If you're buying new, I'd say the 1070 is the clear winner. Used, it's up to you how you value a new versus used piece of hardware.


Geemer
Nov 4, 2010



Zotix posted:

I feel like I'm missing something here with the Founders Edition cards. Why would you ever buy one over a card with better cooling and clocks? On top of that, they're charging nearly $100 more for it.

Because you live in Europe and FE cards are actually cheaper than the nice ones? :negative:

Truga
May 4, 2014
Lipstick Apathy

Zotix posted:

I feel like I'm missing something here with the Founders Edition cards. Why would you ever buy one over a card with better cooling and clocks? On top of that, they're charging nearly $100 more for it.

Some people like to get reference cards to put water blocks on, why not skin them further? :v:

repiv
Aug 13, 2009

THE DOG HOUSE posted:

Do you have a G-Sync monitor? I don't recall any G-Sync options at all in the drop-down menu. I'm at work, unfortunately.

I have a G-Sync monitor, and I didn't mean to imply the G-Sync settings are under the Vertical Sync menu. The VSync menu's settings just change depending on whether G-Sync is enabled elsewhere.

G-Sync on, no Fast Sync option:


G-Sync off, still no Fast Sync option:

SlayVus
Jul 10, 2009
Grimey Drawer

repiv posted:

I have a G-Sync monitor, and I didn't mean to imply the G-Sync settings are under the Vertical Sync menu. The VSync menu's settings just change depending on whether G-Sync is enabled elsewhere.

G-Sync on, no Fast Sync option:


G-Sync off, still no Fast Sync option:


Do you have the latest Nvidia drivers?

repiv
Aug 13, 2009

Yep, 368.39.

fozzy fosbourne
Apr 21, 2010

Do you have a non-Pascal card? I think you need to use the Inspector hack for non-Pascal cards.

I was also under the impression that fast sync was actually detrimental if your frame rate wasn't a stable multiple of your refresh rate, based on accounts from the Blur Busters forums and elsewhere, but dog house's first-hand experience is intriguing. I might try to test this empirically later with frame caps in something like Overwatch.

Spatial
Nov 15, 2007

I was just googling what fast sync is, and it's proper triple buffering for DirectX games (as opposed to just a swap chain of size 3). It's nice of Nvidia to fix Microsoft's poor design for them. :v:
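
For the curious, here's a rough sketch of the idea in Python, with made-up timings -- this is just the gist, not Nvidia's actual implementation: the GPU keeps rendering into spare back buffers, and each refresh scans out only the newest completed frame while the rest get thrown away.

code:
# Toy model of fast sync: the GPU renders flat out, the display latches
# whichever frame finished most recently at each refresh, and every other
# frame is silently dropped. All numbers are made up for illustration.

REFRESH_HZ = 60
RENDER_FPS = 200  # GPU renders much faster than the display refreshes

def fast_sync_scanout(duration_s=1.0):
    refresh_interval = 1.0 / REFRESH_HZ
    render_interval = 1.0 / RENDER_FPS
    displayed = dropped = 0
    pending = None                    # newest completed, not-yet-shown frame
    next_refresh = refresh_interval
    t = 0.0
    while t < duration_s:
        t += render_interval          # another frame finishes rendering
        if pending is not None:
            dropped += 1              # stale frame never reaches the screen
        pending = t
        if t >= next_refresh:         # refresh: scan out the newest frame
            displayed += 1
            pending = None
            next_refresh += refresh_interval
    return displayed, dropped

shown, thrown_away = fast_sync_scanout()
print(f"{shown} frames displayed, {thrown_away} rendered but discarded")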

Gwaihir
Dec 8, 2009
Hair Elf

Zotix posted:

I feel like I'm missing something here with the Founders Edition cards. Why would you ever buy one over a card with better cooling and clocks? On top of that, they're charging nearly $100 more for it.

Founders/reference cards are deliberately priced $100 higher so that Nvidia doesn't directly compete with its board partners (EVGA, MSI, ASUS, Gigabyte, Zotac, etc, etc).

They are the same cards that have shown up at Best Buy in the past, overpriced and with plain Nvidia branding. (One of my coworkers inexplicably bought an Nvidia reference GTX 980 from Best Buy for something like $70 over typical 980 prices, for God knows what reason. People do random stuff.)

For some reason they decided to advertise it this time, though. The reason is possibly "gouge dumb nerds who can't wait" or just "because marketing felt like it."

There is absolutely no reason to buy one, unless you need a blower and really want the blinged-out Nvidia vapor-chamber version instead of a baseline blower card for $100 less.

craig588
Nov 19, 2005

by Nyc_Tattoo

SlayVus posted:

I don't know why the 8GB version has a 2GB/s speed increase though. Does it have an increased bus width or just different speeds?

The reference spec for the 4GB version is 7GHz while the 8GB one is specced to 8GHz, but AMD said they will not be strictly enforcing that, so AIB vendors are free to use whatever speeds they want.

Truga
May 4, 2014
Lipstick Apathy
Forcing triple buffering through that weird third-party Russian Radeon tool is basically how I completely got rid of CrossFire microstutter in almost every game. It's insane that it's not a default option.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

As far as the best non-Founders card goes, the difference between any non-blower card and the Founders Edition is going to be bigger than the difference between the best and any decent non-blower card. Get a good price, don't pay for a factory OC if you overclock manually, and if you like a certain brand's support, go with them. There's no minefield like EVGA's 900-series cooling as far as I know, and there'd be comment if there were; that story broke fast.

repiv posted:

They got one 480 up to 1.4GHz, but tested three more cards and only got them to 1.33-1.35GHz, plus PCGH.de also only got 1.35GHz with custom cooling. Lottery odds aren't looking great so far.

Yeah, between that, the ASIC quality variation of review samples, and the massive power draw differences between review samples, it's screaming low parametric yields getting fixed the Dr. Frankenstein way. Tbh, it's probably for the best given the dilemma; better that problem than higher prices and lower supply. I'd laugh if there were poo poo like specially binned factory OC variants of a $200 card with high-end coolers going for $300 or something, though, especially if they were actually worth it.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SlayVus posted:

If the speed is 4000 or 8000MHz, it's been messed with. The base clock is 2000MHz. GDDR5 on a basic level quadruples that speed to make it effectively 8000.

The GDDR5 on the RX 480 runs at a quad-pumped data rate. I don't know why the 8GB version has a 2GB/s speed increase though. Does it have an increased bus width or just different speeds?

I thought it was 7000 (effective) on the 4GiB card vs 8000 for the 8GiB one.

movax
Aug 30, 2008

fletcher posted:

Can't tell if maybe he just added it to his own alerts or what. I would have thought it would have come up earlier in this thread that it was goon-run, since so many of us were using it.

I wish, but I'm just a user of it.

Signed up on EVGA's website and never heard anything, so I figured I'd try nowinstock.

repiv
Aug 13, 2009

HalloKitty posted:

I thought it was 7000 (effective) on the 4GiB card vs 8000 for the 8GiB one.

AMD's reference design calls for 7Gbps-rated GDDR5 chips on the 4GB version and 8Gbps-rated chips on the 8GB version, but I don't think there's anything stopping AIBs from putting 4GB of 8Gbps chips on a card.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

HalloKitty posted:

I thought it was 7000 (effective) on the 4GiB card vs 8000 for the 8GiB one.

I believe the reason given is to open the door to lower-spec VRAM for the 4GB variant, while the chips for the 8GB variant are more reliably fast. However, a factory memory OC may just end up being standard.

Zotix
Aug 14, 2011



What happened with the 900 series cards?

NihilismNow
Aug 31, 2003
Oh hey, it is the end of June, I guess I can order one of those spiffy $200 RX480 cards now.
~Card is actually $365 with a reference cooler~

At that point I might as well buy a GTX 970, which is significantly cheaper (I was prepared for it actually costing €250 for an 8GB with a decent cooler, but €300+ is just nope).

Klyith
Aug 3, 2007

GBS Pledge Week

Spatial posted:

Human vision doesn't have a framerate. It's a continuous analog system. :negative:

a) I was trying to be concise rather than :goonsay:

b) It's definitely not continuous. Flicker in rapidly rotating objects, persistence of vision, and a number of optical illusions work because there are discrete events. Nothing with nerve cells is continuous; neurons have a "tick rate" just like transistors, but different types run at different speeds.

But it's not a CCD camera either. For one thing, the optic nerve doesn't have enough bandwidth to work like one. A few years ago I read a really neat article about scientists trying to extract images from the optic nerve and decipher the biological data compression -- the pictures they were able to reconstruct were mostly just outlines.

Phlegmish
Jul 2, 2011



Geemer posted:

Because you live in Europe and FE cards are actually cheaper than the nice ones? :negative:

Yeah, most of the FE cards I've seen are about the same price as, or even cheaper than, the custom cards. MSRP means very little right now.

e: Looking at this webshop, I don't see a single 1070 below €500, so gently caress that.

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
I was waiting to see what the 480 was like before deciding what to upgrade to from my GTX 560. With the fire sales going on, an EVGA SC is £200 and an FTW+ is £230. Other than DX12, would I be missing anything by going for the 970 over the 480? If the 970 is the better option, is the FTW+ worth the extra over the Superclocked?

Truga
May 4, 2014
Lipstick Apathy

NihilismNow posted:

Oh hey, it is the end of June, I guess I can order one of those spiffy $200 RX480 cards now.
~Card is actually $365 with a reference cooler~

At that point I might as well buy a GTX 970, which is significantly cheaper (I was prepared for it actually costing €250 for an 8GB with a decent cooler, but €300+ is just nope).

Where are you seeing these prices? They're €269 on Geizhals.

e: Also you can get a Sapphire 4GB for €219, which is just as good.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Lungboy posted:

I was waiting to see what the 480 was like before deciding what to upgrade to from my GTX 560. With the fire sales going on, an EVGA SC is £200 and an FTW+ is £230. Other than DX12, would I be missing anything by going for the 970 over the 480? If the 970 is the better option, is the FTW+ worth the extra over the Superclocked?

The main concern I'd have is that I'd expect the 480 to do relatively better in newer games (its worst showings are in older games, it's a good DX12 performer, and it's a new card while the 970 is a last-generation card, so drivers might not focus on it as much), but if the 970 is cheaper it's not a bad buy.

Truga
May 4, 2014
Lipstick Apathy

Lungboy posted:

I was waiting to see what the 480 was like before deciding what to upgrade to from my GTX 560. With the fire sales going on, an EVGA SC is £200 and an FTW+ is £230. Other than DX12, would I be missing anything by going for the 970 over the 480? If the 970 is the better option, is the FTW+ worth the extra over the Superclocked?

If you're at all in the market for monitors, G-Sync has a $200 premium over FreeSync. I think that's about it.

penus penus penus
Nov 9, 2014

by piss__donald

fozzy fosbourne posted:

Do you have a non-Pascal card? I think you need to use the Inspector hack for non-Pascal cards.

I was also under the impression that fast sync was actually detrimental if your frame rate wasn't a stable multiple of your refresh rate, based on accounts from the Blur Busters forums and elsewhere, but dog house's first-hand experience is intriguing. I might try to test this empirically later with frame caps in something like Overwatch.

Well, Blur Busters can get pretty serious, and my testing has been with one game, Overwatch, lol. But I can certainly say for a fact that with it on there is no visually obvious tearing like I get when it's off completely; VSync works just as well, but the input lag is obnoxious. And I'm only pushing 70 fps.

lDDQD
Apr 16, 2006

Klyith posted:

a) I was trying to be concise rather than :goonsay:

b) It's definitely not continuous. Flicker in rapidly rotating objects, persistence of vision, and a number of optical illusions work because there are discrete events. Nothing with nerve cells is continuous; neurons have a "tick rate" just like transistors, but different types run at different speeds.

But it's not a CCD camera either. For one thing, the optic nerve doesn't have enough bandwidth to work like one. A few years ago I read a really neat article about scientists trying to extract images from the optic nerve and decipher the biological data compression -- the pictures they were able to reconstruct were mostly just outlines.

At some point, sampling definitely occurs: this is obvious because you're vulnerable to aliasing. However, nobody ever said a system that performs sampling must necessarily be discrete.
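
As a toy illustration of the aliasing point (the classic wagon-wheel effect; the numbers are arbitrary, and this says nothing about the eye's actual mechanics):

code:
# Wagon-wheel aliasing: a wheel spinning slightly slower than the sample
# rate appears to rotate backwards. Purely illustrative.
import math

def apparent_hz(true_hz, sample_hz):
    # Fold the true frequency into the (-sample_hz/2, sample_hz/2] band,
    # which is all a sampled observer can distinguish.
    folded = math.fmod(true_hz, sample_hz)
    if folded > sample_hz / 2:
        folded -= sample_hz
    return folded

print(apparent_hz(23, 24))  # -1.0 Hz: the wheel seems to spin backwards
print(apparent_hz(25, 24))  #  1.0 Hz: a slow forward crawl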

Siets posted:

Thinking along the lines of the earlier "4K is probably overkill right now" conversation and also hearing all of the benefits of GSync, I'm now intrigued by the idea of pairing a GTX 1080 with a 1440p GSync monitor. Anybody have any "swear-by" recommendations in this category?

Acer XB271HU.

lDDQD fucked around with this message at 17:48 on Jun 30, 2016

NihilismNow
Aug 31, 2003

Truga posted:

Where are you seeing these prices? They're €269 on Geizhals.

e: Also you can get a Sapphire 4GB for €219, which is just as good.

Alternate has every 8GB version at €319. Some shops have slightly lower prices but don't have stock. I'll wait and see how much the aftermarket cooler versions are, I guess.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
So I just got the $399.99 Gigabyte GTX 1070 in.

The first benchmark I ran was Fire Strike Ultra, because it was VRAM-limited on my 2GB GTX 770. My score went from 733 to 4388 with the VRAM bottleneck eliminated. SteamVR score of 10.9.

Out of the box the card boosted up to 1924MHz, and temps maxed out at 71 degrees. The cooler seems pretty quiet so far as well, better than my old Zotac 770. I really like how these modern cards have fans that shut down when you're not putting a significant load on the GPU; it makes my PC almost silent.

Klyith
Aug 3, 2007

GBS Pledge Week

lDDQD posted:

At some point, sampling definitely occurs: this is obvious because you're vulnerable to aliasing. However, nobody ever said a system that performs sampling must necessarily be discrete.

And I didn't say it was; I even put "framerate" in quote marks, which should indicate that it's not exact. But if people really want to be spergs (and I guess this is the thread for them), put it this way: our visual system can distinguish and recognize images displayed for between 1/45 and 1/90 of a second for most of the population, with outliers able to do so past 1/200 of a second.



edit: but to keep things on GPU topic, fast sync seems pretty limited in usefulness. The amount of latency added by a full buffer when framerate > refresh rate is pretty low; I can't imagine anything but competitive FPS needing that. Maybe MOBAs? But the other thing it does is throw away a lot of work, so the tradeoff is that you're keeping the GPU fully loaded (and the fans at 100%). It doesn't seem worth it for single-player games.

It's nice that it works in borderless windowed mode, though; it's pretty annoying that AMD's VSync features like frame rate control only work in fullscreen.

SlayVus
Jul 10, 2009
Grimey Drawer

HalloKitty posted:

I thought it was 7000 (effective) on the 4GiB card vs 8000 for the 8GiB one.

I was just using his example of 8000MHz. Whatever the speed is, it's quad pumped. So 8000 is actually 2000, and 7000 is actually 1750.
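
For a back-of-the-envelope check of that math (assuming the 480's 256-bit bus here, which is my assumption, not something stated above):

code:
# Back-of-the-envelope GDDR5 math: the effective rate is 4x the base clock
# (quad pumped), and total bandwidth is the per-pin rate times the bus
# width. The 256-bit bus width is an assumption for illustration.

def gddr5_bandwidth(base_clock_mhz, bus_width_bits=256):
    effective_gbps = base_clock_mhz * 4 / 1000      # per pin, quad pumped
    bandwidth_gb_s = effective_gbps * bus_width_bits / 8
    return effective_gbps, bandwidth_gb_s

for base in (1750, 2000):                           # "7000" and "8000" effective
    rate, bw = gddr5_bandwidth(base)
    print(f"{base} MHz base -> {rate:.0f} Gbps/pin, {bw:.0f} GB/s total")

On those numbers, the 7Gbps and 8Gbps reference specs come out to 224 GB/s versus 256 GB/s of total bandwidth.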

fozzy fosbourne
Apr 21, 2010

Klyith posted:

And I didn't say it was; I even put "framerate" in quote marks, which should indicate that it's not exact. But if people really want to be spergs (and I guess this is the thread for them), put it this way: our visual system can distinguish and recognize images displayed for between 1/45 and 1/90 of a second for most of the population, with outliers able to do so past 1/200 of a second.

I'm also curious whether the threshold for "distinguishing and recognizing" is different from the one for "perceiving blur when tracking a moving object" (especially in the context of sample-and-hold LCDs). It's either that or I'm one of your superhuman outliers :smug: (in which case you can consider me the founder of the Brotherhood of Evil Mutants as of today)

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

PC LOAD LETTER posted:

How so? The whole point of many of those slides was that they'd improved the general durability of the chip and allowed it to age more gracefully in response to changes in voltage (as stuff like the VRM hardware degrades) over time! You know the 1080s, 1070s, and every other GPU (or for that matter, all electrical components) ever made have this issue too, right? They all degrade with time and use. You're posting in such a way as to present and interpret their efforts to address this in the worst light possible, and it doesn't make any sense to me.

Yes, every other GPU/IC has this issue (although as I said before, running at much higher temps like the 480 seems to do will accelerate this process).

Traditionally, the approach has been "Let's pick a safety margin so that even given a certain degree of aging, the majority of chips will not show any problems."
AMD's new approach appears to be "Let's set the clocks as high as we can get away with rather than worrying about a safety margin, and degrade performance over time in response to actual detected changes."

The AMD approach might provide a better overall experience, because you get higher stock clocks at the start, and you only use as much "safety margin" as a particular card actually needs. It also means that, up to a point, a card that might otherwise start failing as it ages will just step itself down instead.

The question, which I think is legitimate and which I ask in a curious rather than accusatory way, is: how much is that drop actually going to be in practice? If it's only a small amount for most cards, then it seems like an added benefit to the consumer -- kind of an extension of boost clocking, where the GPU finds its own maximum stable speed and saves you the trouble of overclocking manually to figure it out.
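
To make the contrast concrete, here's a deliberately toy sketch of that kind of self-stepping behavior -- hypothetical numbers and interface, nothing to do with AMD's real firmware:

code:
# Hypothetical sketch of age-aware clocking: ship at the highest clock that
# validated, and step down only when degradation is actually detected,
# instead of baking a big fixed safety margin into every card up front.

class AgingAwareClock:
    def __init__(self, max_clock_mhz=1340, step_mhz=13, floor_mhz=1120):
        self.clock = max_clock_mhz   # the edge, not the edge minus a margin
        self.step = step_mhz
        self.floor = floor_mhz       # never drop below the advertised base

    def on_margin_failure(self):
        # A voltage/timing margin check failed as the silicon aged:
        # trade a sliver of performance for continued stability.
        self.clock = max(self.clock - self.step, self.floor)
        return self.clock

gpu = AgingAwareClock()
for year in (1, 2, 3):               # pretend one detected failure per year
    print(f"year {year}: stepping down to {gpu.on_margin_failure()} MHz")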

Green Gloves
Mar 3, 2008
Aren't G-Sync and fast sync supposed to complement each other? Where G-Sync is great for any range at or below the refresh rate of your monitor, and fast sync is good if you play a twitchy shooter with unlocked frames and the fps goes way higher than the refresh rate of your monitor?

I should probably discuss this in the other thread, haha.

In other news, I have shown great discipline in not opening my Gigabyte G1 1070 and in selling it on eBay. I just wish I'd made more than $50. But that just means I will be spending $50 less on a 1070. Awesome!

penus penus penus
Nov 9, 2014

by piss__donald

Green Gloves posted:

Aren't G-Sync and fast sync supposed to complement each other? Where G-Sync is great for any range at or below the refresh rate of your monitor, and fast sync is good if you play a twitchy shooter with unlocked frames and the fps goes way higher than the refresh rate of your monitor?

I should probably discuss this in the other thread, haha.

In other news, I have shown great discipline in not opening my Gigabyte G1 1070 and in selling it on eBay. I just wish I'd made more than $50. But that just means I will be spending $50 less on a 1070. Awesome!

I read that as "they don't interfere with each other" rather than complement, but it would make sense on the face of it. Above 144Hz, G-Sync would start acting like VSync... but I'd argue that you should just frame cap to avoid all the issues, since G-Sync will handle everything below 144Hz anyway. No input lag, no extra VSync load on the GPU.

Tanreall
Apr 27, 2004

Did I mention I was gay for pirate ducks?

~SMcD

Hubis posted:

Yes, every other GPU/IC has this issue (although as I said before, running at much higher temps like the 480 seems to do will accelerate this process).

You mean the same temps as the GTX 1070/1080 FE cards?

https://www.computerbase.de/2016-06/radeon-rx-480-test/8/

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

THE DOG HOUSE posted:

you should just frame cap ... No input lag

This is an oft-repeated myth. Frame capping still adds input lag (compared to rendering more frames when the card is capable of it).
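
The rough arithmetic behind that, as a toy model that ignores every other stage of the pipeline: input sampled at the start of a frame is about one frame-time old by the time the frame is ready, so capping to a lower frame rate means staler input.

code:
# Toy input-lag arithmetic: one frame-time of input age per frame, so a
# lower frame rate means older input at frame completion. Illustrative only.

def input_age_ms(fps):
    return 1000.0 / fps

for fps in (300, 144, 60):
    print(f"{fps:>3} fps -> input ~{input_age_ms(fps):.1f} ms old at frame completion")

So capping from 300 fps down to 144 fps costs a few milliseconds of input age before any sync method even enters the picture.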

movax
Aug 30, 2008

drat, <1 minute from SMS alert and 1080 FTWs gone on Newegg. Motherfucker! :argh:

wicka
Jun 28, 2007


movax posted:

drat, <1 minute from SMS alert and 1080 FTWs gone on Newegg. Motherfucker! :argh:

I got my 1070 by camping Newegg and refreshing the page 100 times a minute. I saw it become available and completed the order, and it never came up on nowinstock.

Green Gloves
Mar 3, 2008

Skuto posted:

This is an oft-repeated myth. Frame capping still adds input lag (compared to rendering more frames when the card is capable of it).

If that's the case, what would create more input lag: frame capping or fast sync?


averox
Feb 28, 2005



:dukedog:
Fun Shoe
I keep debating whether I should spring for an FTW or a Strix OC, because that's what I had my heart set on, but I keep coming back to the fact that this Gigabyte 1080 G1 should be just fine. But, I mean, maaaybee.
