Subjunctive
Sep 12, 2006

✨sparkle and shine✨

xthetenth posted:

To me it seems that with the split generations, the Ti bracket makes no sense unless you absolutely have to have the best and are willing to work the used-card market with 'old' cards. My 970 is less than a year old and sure, I could get a bunch more performance, but it would come at twice the cost. If I'm going to upgrade every other half-generation step, the price/performance is well on the side of the high midrange (I'm pissed that the midrange chips are now 50% more expensive, though).

Isn't the price/perf of the 980 Ti about the same as the 970's?

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
The 980 Ti makes perfect sense if you're playing at 1440p or higher resolutions; it's not like there are better options out there for applications where the 970 isn't fast enough.

LiquidRain
May 21, 2007

Watch the madness!

MaxxBot posted:

The 980 Ti makes perfect sense if you're playing at 1440p or higher resolutions; it's not like there are better options out there for applications where the 970 isn't fast enough.
I feel like the thread forgets something a lot.

The 980 Ti makes perfect sense at 1440p or above if you want to play the newest games with max detail at 60fps.

If $300ish in your pocket is worth more than some shiny polygons, it doesn't make sense.

The way people talk in here evokes a sense of buyer's remorse over my 970, when the reality is that for the games I play I just don't need the 980 Ti's power, and in every other case I just turn down a knob or two.

FraudulentEconomics
Oct 14, 2007

Pst...
The real solution is B-stock 980s. Better than a 970 at nearly the same price point. I had my heart set on a 980 Ti, but catching the 980 B-stock when it was available made me realize I was in love with the perceived power, not the card itself. It also led me to explore overclocking and made me comfortable with it.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

LiquidRain posted:

If $300ish in your pocket is worth more than some shiny polygons, it doesn't make sense.

Yeah, I went that route, although honestly I probably should have a 980 Ti, considering the monitor I run and how much I've spent on monitors this year.

Subjunctive posted:

Isn't the price/perf of the 980 Ti about the same as the 970's?

Could've sworn it was less; I thought it was something like twice the price for 50% more performance. Great deal for a high-end card, but definitely past the optimum point.
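
Back-of-envelope, using launch MSRPs ($329 for the 970, $649 for the 980 Ti) and that rough 1.5x figure. A toy calculation, not a benchmark:

code:

// Perf-per-dollar sketch. Prices are launch MSRPs; the 1.5x performance
// figure is the rough number quoted above, not a measured result.
#include <cstdio>

int main() {
    const double price_970   = 329.0;  // GTX 970 launch MSRP
    const double price_980ti = 649.0;  // GTX 980 Ti launch MSRP
    const double perf_970    = 1.00;   // normalize the 970 to 1.0
    const double perf_980ti  = 1.50;   // the "50% more performance" figure

    const double ppd_970   = perf_970 / price_970;
    const double ppd_980ti = perf_980ti / price_980ti;

    std::printf("970:    %.5f perf/$\n", ppd_970);
    std::printf("980 Ti: %.5f perf/$\n", ppd_980ti);
    std::printf("980 Ti = %.0f%% of the 970's perf per dollar\n",
                100.0 * ppd_980ti / ppd_970);
}

That comes out to the 980 Ti delivering roughly three-quarters of the 970's performance per dollar, i.e. past the knee of the curve.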


FraudulentEconomics posted:

The real solution is B-stock 980s. Better than a 970 at nearly the same price point. I had my heart set on a 980 Ti, but catching the 980 B-stock when it was available made me realize I was in love with the perceived power, not the card itself. It also led me to explore overclocking and made me comfortable with it.

These days yes it is. A 970 when they were new (what I did) or a good aftermarket 290 when the price collapsed were also good picks.

SlayVus
Jul 10, 2009
Grimey Drawer

LiquidRain posted:

I feel like the thread forgets something a lot.

The 980 Ti makes perfect sense at 1440p or above if you want to play the newest games with max detail at 60fps.

If $300ish in your pocket is worth more than some shiny polygons, it doesn't make sense.

The way people talk in here evokes a sense of buyer's remorse over my 970, when the reality is that for the games I play I just don't need the 980 Ti's power, and in every other case I just turn down a knob or two.

1080p 980 Ti here. It's amazing even at 1080p, but for $700 I think I am going to try to get a better monitor down the line. Still running a 32" 120Hz Vizio TV.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

I honestly don't get framerates low enough to bother me in what I'm playing these days, even at 3440x1440. I'm just going to wait for Arctic Islands if it's any good, and if it's not, I'll get a Pascal card.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

xthetenth posted:

Could've sworn it was less; I thought it was something like twice the price for 50% more performance. Great deal for a high-end card, but definitely past the optimum point.

Correct, although due to SLI scaling a single 980 Ti works out to be in the same ballpark as SLI 970s. So if you think you might ever want to add a second card for SLI, you should probably just buy a 980 Ti in the first place and get the benefits of running a single card.
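
Rough sketch of the ballpark math. The scaling percentages here are illustrative; real SLI scaling varies wildly per game:

code:

// Toy SLI scaling model, reusing the normalized numbers from above.
#include <cstdio>
#include <initializer_list>

int main() {
    const double perf_970   = 1.00;  // single 970, normalized
    const double perf_980ti = 1.50;  // ~50% faster, per the posts above

    // 0% = game with no SLI profile; 70-90% = typical-ish good cases
    for (double scaling : {0.0, 0.7, 0.9}) {
        const double sli_970s = perf_970 * (1.0 + scaling);
        std::printf("SLI 970s at %3.0f%% scaling: %.2fx (single 980 Ti: %.2fx)\n",
                    scaling * 100.0, sli_970s, perf_980ti);
    }
}

On paper the pair edges out the single card at typical scaling, but the moment a game has no SLI profile you're back to a single 970, which is the "benefits of running a single card" part.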

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

LiquidRain posted:

I feel like the thread forgets something a lot.

The 980 Ti makes perfect sense at 1440p or above if you want to play the newest games with max detail at 60fps.

If $300ish in your pocket is worth more than some shiny polygons, it doesn't make sense.

The way people talk in here evokes a sense of buyer's remorse over my 970, when the reality is that for the games I play I just don't need the 980 Ti's power, and in every other case I just turn down a knob or two.

People put far too much emphasis on "maxing out" games these days anyway, since there was a good five years or so where even a toaster could run max settings on the basic console ports we were getting. Turns out that when games are designed to run on a ~250 GFLOPS system, it's easy to crank poo poo up.

But when you look at something like The Witcher 3's HairWorks and its ultra foliage setting, those are completely unreasonable on even the beefiest systems. We shouldn't care that a 970 can't turn that stuff on and hit the sweet sweet six oh eff pee ess, since there will be a day when that stuff runs on a modest system without breaking a sweat, and we'll be glad it's there.

More games should ship with settings we can't really use on current systems, because I can boot up Crysis 3 and it still looks loving great.

Truga
May 4, 2014
Lipstick Apathy
http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/1200#post_24356995
On async shaders and nvidia:

quote:

By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn't really have Async Compute so I don't know why their driver was trying to expose that.
Yikes.
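
For reference, the "look at the Vendor ID" part is plain old DXGI adapter sniffing. The DXGI calls below are real and the PCI vendor IDs are the standard ones; the path-selection policy itself is just my guess at what Oxide's logic looks like:

code:

// Sketch of vendor-ID-based path selection, as the Oxide post describes.
// Build with: cl /EHsc sniff.cpp dxgi.lib   (filename is hypothetical)
#include <windows.h>
#include <dxgi.h>
#include <cstdio>

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        bool useAsyncCompute = true;          // trust the driver by default
        if (desc.VendorId == 0x10DE) {        // NVIDIA
            // Driver reports the feature; benchmarks said otherwise,
            // so force the serial path (guessed policy).
            useAsyncCompute = false;
        } else if (desc.VendorId == 0x1002) { // AMD
            useAsyncCompute = true;           // GCN does this in hardware
        }
        std::printf("%ls -> async compute %s\n", desc.Description,
                    useAsyncCompute ? "on" : "off");
        adapter->Release();
    }
    factory->Release();
}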

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Paul MaudDib posted:

Correct, although due to SLI scaling a single 980 Ti works out to be in the same ballpark as SLI 970s. So if you think you might ever want to add a second card for SLI, you should probably just buy a 980 Ti in the first place and get the benefits of running a single card.

You also want the 980 Ti instead of SLI 970s because in any case where SLI 970s would make sense you also want more VRAM than that setup can provide.
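
(AFR SLI mirrors every resource on both cards, so VRAM doesn't pool. Quick illustration, nothing more:)

code:

// AFR SLI keeps a copy of every resource on each card, so capacity
// doesn't add up the way the spec sheets imply.
#include <cstdio>

int main() {
    const double vram_per_card = 4.0;  // GB printed on the 970's box
    const int    cards         = 2;

    const double naive  = vram_per_card * cards;  // what the sticker implies
    const double usable = vram_per_card;          // what AFR actually gives

    std::printf("naive total: %.1f GB, usable under SLI: %.1f GB\n",
                naive, usable);
}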

cat doter posted:

People put far too much emphasis on "maxing out" games these days anyway, since there was a good five years or so where even a toaster could run max settings on the basic console ports we were getting. Turns out that when games are designed to run on a ~250 GFLOPS system, it's easy to crank poo poo up.

But when you look at something like The Witcher 3's HairWorks and its ultra foliage setting, those are completely unreasonable on even the beefiest systems. We shouldn't care that a 970 can't turn that stuff on and hit the sweet sweet six oh eff pee ess, since there will be a day when that stuff runs on a modest system without breaking a sweat, and we'll be glad it's there.

More games should ship with settings we can't really use on current systems, because I can boot up Crysis 3 and it still looks loving great.

I remember seeing a video of someone running TW3 with everything on, including HairWorks, at UHD resolution and getting 50+ FPS. I think the setup was an 8C/16T i7 OCed to like 4.7GHz with triple-SLIed 980 Tis or something ridiculous like that.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

xthetenth posted:

Could've sworn it was less; I thought it was something like twice the price for 50% more performance. Great deal for a high-end card, but definitely past the optimum point.

Yeah, it depends on the content and resolution for sure. In http://www.pcgamer.com/nvidia-geforce-gtx-980-ti-review/ or http://www.maximumpc.com/nvidia-geforce-gtx-980-ti-review/, at 1440/2160 you see well over 50% gains, but it's unclear whether the content is just GPU-bound on the 980 Ti.

Truga
May 4, 2014
Lipstick Apathy
According to most of the graphs in the second link, a 980 Ti is basically 970 SLI for twice the price of a single 970. Which is great, 'cause I don't have to bother with SLI BS anymore :v:

Truga fucked around with this message at 02:51 on Aug 31, 2015

Anime Schoolgirl
Nov 28, 2002

I'm still laughing about how a 290, which is now priced at nearly bottom bin, will get three to four straight years of being a viable purchase if DX12 proliferation is even a quarter of what AMD needs.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
Seriously, a 290X equaling a 980 Ti in DX12 applications could be the most exciting thing to happen in GPUs in years. I really hope it ends up with AMD clawing back some market share, because the situation is really scary right now, what with Nvidia's almost complete dominance and the almost delusional lengths AMD defenders have to go to to make them look good.

Truga
May 4, 2014
Lipstick Apathy
Ashes is quite the edge case, since it's a SupCom/PA that doesn't reduce units to icons when you zoom out a bit. I wouldn't draw too many conclusions from it. The same post also says UE4 doesn't support async shaders, and UE4 is going to be *the* engine for a couple of years now, it seems, so... On the other hand, console devs say async shaders give them a 30% FPS boost on PS4/Xbone, so who knows.

I just found the part where Nvidia exposes some DX12 feature in their driver, probably to check a box so they can say "DX12 level n compatible," but the feature doesn't actually work. Funny.
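
Worth noting why the driver can "report" the feature at all: D3D12 has no caps bit for async compute. Any device will happily hand you a compute queue; whether that work actually overlaps graphics is entirely up to the hardware/driver. Minimal sketch, assuming an existing ID3D12Device* named device:

code:

// D3D12 will always hand you a compute queue; creating one says nothing
// about whether the GPU actually runs it concurrently with graphics.
// Assumes an existing ID3D12Device* named `device`; link with d3d12.lib.
#include <windows.h>
#include <d3d12.h>

ID3D12CommandQueue* CreateComputeQueue(ID3D12Device* device) {
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type     = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;
    desc.Flags    = D3D12_COMMAND_QUEUE_FLAG_NONE;

    ID3D12CommandQueue* queue = nullptr;
    if (FAILED(device->CreateCommandQueue(&desc, __uuidof(ID3D12CommandQueue),
                                          reinterpret_cast<void**>(&queue))))
        return nullptr;  // failing here would be a driver bug...
    return queue;        // ...but succeeding proves nothing about concurrency
}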

Deathreaper
Mar 27, 2010
I purchased a pair of used Sapphire R9 290 Tri-X OC cards for $500 a year and a half ago for my 2560x1600 monitor. At the time, I was kind of sad I couldn't find a pair of 780s/780 Tis for the price. I can tell you I'm pretty happy I got the 290s, considering their performance improvements over time and what seems to be neglect from Nvidia on the 600/700 series. If the DX12 performance continues to be impressive, I think I could keep the cards for another year or two... Power consumption and heat output are suboptimal, but that's a non-issue in the land of 6¢/kWh.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

Truga posted:

Ashes is quite the edge case, since it's a SupCom/PA that doesn't reduce units to icons when you zoom out a bit. I wouldn't draw too many conclusions from it. The same post also says UE4 doesn't support async shaders, and UE4 is going to be *the* engine for a couple of years now, it seems, so... On the other hand, console devs say async shaders give them a 30% FPS boost on PS4/Xbone, so who knows.

I just found the part where Nvidia exposes some DX12 feature in their driver, probably to check a box so they can say "DX12 level n compatible," but the feature doesn't actually work. Funny.

If 30% is true, then I'd take any bet that Epic is implementing it as we speak; they're not ones to let that much performance just dangle in the wind.

The 900 series not having async compute just seems strange to me. I wonder why they're surfacing it in their drivers if they don't have it in hardware; that seems like a really un-Nvidia thing to do.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

cat doter posted:

The 900 series not having async compute just seems strange to me. I wonder why they're surfacing it in their drivers if they don't have it in hardware; that seems like a really un-Nvidia thing to do.

Clearly they're just practicing for Pascal.

Anime Schoolgirl
Nov 28, 2002

Truga posted:

Ashes is quite the edge case, since it's a SupCom/PA that doesn't reduce units to icons when you zoom out a bit. I wouldn't draw too many conclusions from it. The same post also says UE4 doesn't support async shaders, and UE4 is going to be *the* engine for a couple of years now, it seems, so... On the other hand, console devs say async shaders give them a 30% FPS boost on PS4/Xbone, so who knows.
They're gonna have to support those if they want UE4 games to run smoothly at all on the consoles we're stuck with for three more years. Unless they're okay with every UE4 game capping out at 20-30 FPS, in which case, well.

repiv
Aug 13, 2009

Truga posted:

Ashes is quite the edge case, since it's a SupCom/PA that doesn't reduce units to icons when you zoom out a bit.

:agreed:

UE4's DX12 path is available in the public repo now, so it'll be interesting to see how it performs as it matures.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

cat doter posted:

If 30% is true, then I'd take any bet that Epic is implementing it as we speak; they're not ones to let that much performance just dangle in the wind.

The 900 series not having async compute just seems strange to me. I wonder why they're surfacing it in their drivers if they don't have it in hardware; that seems like a really un-Nvidia thing to do.

It'd be like outright lying about how many ROPs a card has and how much of its memory it can access at once.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

xthetenth posted:

It'd be like outright lying about how many ROPs a card has and how much of its memory it can access at once.

Heh, right. I think this case is a little different, though. Usually Nvidia surfaces stuff in their drivers that's a win for them, but this seems odd.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

cat doter posted:

Heh, right. I think this case is a little different, though. Usually Nvidia surfaces stuff in their drivers that's a win for them, but this seems odd.

Maybe they've got a terrible implementation now that they hope they can sort out. No idea; it's early days yet. I just hope we keep having two companies making video cards.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

xthetenth posted:

Maybe they've got a terrible implementation now that they hope they can sort out. No idea; it's early days yet. I just hope we keep having two companies making video cards.

Yeah, same, I hope async compute is a big thing for AMD. They could certainly use a win.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

PC ports of console games should tilt in AMD's favour if this is true, which would be quite the development. Console-first houses are likely to be less susceptible to NVIDIA's devrel charms.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

Subjunctive posted:

PC ports of console games should tilt in AMD's favour if this is true, which would be quite the development. Console-first houses are likely to be less susceptible to NVIDIA's devrel charms.

Everyone is susceptible to co-marketing dollars, and Nvidia has a lot of them.

Wistful of Dollars
Aug 25, 2009

I still have a 290(x) if someone wants to buy it :sun:

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

cat doter posted:

Everyone is susceptible to co-marketing dollars, and Nvidia has a lot of them.

Yes, but I said "less" rather than "not at all", and I was referring to their developer relations function rather than marketing.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

Subjunctive posted:

Yes, but I said "less" rather than "not at all", and I was referring to their developer relations function rather than marketing.

I feel like those kinda go hand in hand for Nvidia, with more devrel focus in the areas they're marketing, though I guess Arkham Knight kinda disproves that theory.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

cat doter posted:

The 900 series not having async compute just seems strange to me. I wonder why they're surfacing it in their drivers if they don't have it in hardware; that seems like a really un-Nvidia thing to do.

The drivers support it on paper; it's just that without actual hardware support, it's slower than not using it. They checked the box for marketing purposes.
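
Toy model of why "supported but emulated" loses. All the numbers here are invented for illustration:

code:

// Why exposing "async" without hardware support can be slower than
// just not using it: serialized work plus a scheduling tax.
#include <algorithm>
#include <cstdio>

int main() {
    const double graphics_ms = 10.0;  // graphics work per frame
    const double compute_ms  = 4.0;   // compute work per frame
    const double sched_ms    = 1.5;   // hypothetical software-scheduling tax

    const double serial   = graphics_ms + compute_ms;           // one queue
    const double hw_async = std::max(graphics_ms, compute_ms);  // true overlap
    const double emulated = graphics_ms + compute_ms + sched_ms;

    std::printf("serial:           %4.1f ms\n", serial);
    std::printf("hardware async:   %4.1f ms\n", hw_async);
    std::printf("emulated 'async': %4.1f ms\n", emulated);
}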

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

cat doter posted:

I feel like those kinda go hand in hand for nvidia, more focus in the areas of their marketing, though I guess Arkham Knight kinda disproves that theory.

I've only worked with their devrel side, but my understanding is that the co-marketing stuff doesn't really play into technology choices.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Subjunctive posted:

PC ports of console games should tilt in AMD's favour if this is true, which would be quite the development. Console-first houses are likely to be less susceptible to NVIDIA's devrel charms.

There's also the issue that the consoles themselves have AMD GPUs.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
So what's the release window for Pascal again? If Nvidia doesn't manage to retrofit working async compute onto it, they've dropped the ball so much they're hosed.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Combat Pretzel posted:

So what's the release window for Pascal again? If Nvidia doesn't manage to retrofit working async compute onto it, they've dropped the ball so much they're hosed.
I think we're jumping the gun just a tad here; it seems FAR more likely that this one developer was either using it wrong or hit a driver bug than that Maxwell 2 doesn't actually support this claimed feature in hardware.

Daviclond
May 20, 2006

Bad post sighted! Firing.
There were rumours of the big Pascal GP100 chip taping out a couple of months ago, so the dies are almost certainly already cast at this stage. Here's hoping Nvidia did a proper analysis of DX12's impact on the optimal architecture!

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness

Truga posted:

The same post also says UE4 doesn't support async shaders, and UE4 is going to be *the* engine for a couple of years now, it seems, so...

This is kind of a minor point, but I don't think UE4 is or will be the default engine the way UE3 was. EA, Ubisoft, Square Enix, and Activision all have their own in-house engines, and a lot of the 3D indie stuff is on Unity. The only recent major Unreal Engine games that come to mind are Arkham Knight and MKX from WB Games, and both of those are UE3. I don't think any triple-A games this fall or early next year are gonna use it either.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Biggest human being Ever posted:

I don't think any triple-A games this fall or early next year are gonna use it either.

The wiki has quite a few titles I've already played, or that are coming out soon, listed as using UE4: https://en.wikipedia.org/wiki/List_of_Unreal_Engine_games#Unreal_Engine_4

Ark, Daylight, Dead Island 2, Dragon Quest XI, EVE: Valkyrie, Fable Legends, Street Fighter V, and Tekken 7 are a few notables.

Edit: UE4 is also the go-to for high-end VR titles right now; they're even making VR movies with it for the Rift.

Zero VGS fucked around with this message at 15:39 on Aug 31, 2015

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Biggest human being Ever posted:

This is kind of a minor point, but I don't think UE4 is or will be the default engine the way UE3 was. EA, Ubisoft, Square Enix, and Activision all have their own in-house engines, and a lot of the 3D indie stuff is on Unity.

Umm what? There are a ton of AAA titles coming out using UE4 by the very same developers you mention ;)

ijyt
Apr 10, 2012

Zero VGS posted:

The wiki has quite a few titles I've already played, or that are coming out soon, listed as using UE4: https://en.wikipedia.org/wiki/List_of_Unreal_Engine_games#Unreal_Engine_4

Ark, Daylight, Dead Island 2, Dragon Quest XI, EVE: Valkyrie, Fable Legends, Street Fighter V, and Tekken 7 are a few notables.

Edit: UE4 is also the go-to for high-end VR titles right now; they're even making VR movies with it for the Rift.

Ark is UE4? Genuinely couldn't tell.
