Mr.PayDay
Jan 2, 2004
life is short - play hard
TotalBiscuit showed the texture quality settings in his "WTF is Shadow of Mordor" video. The "Ultra" texture pack needs 6 GB of VRAM.
If I buy a 970 or 980 SLI system, does the VRAM add up? Am I still limited to 4 GB of VRAM, or do the two Nvidia cards combine for 8 GB?
Sorry if this has been asked before; the FAQ in the OP does not answer that.


Mr.PayDay
Jan 2, 2004
life is short - play hard
Alright, thank you for the fast answers, lads.

Mr.PayDay
Jan 2, 2004
life is short - play hard
Hell broke loose in the German gaming and computer tech forums; Nvidia is facing the nerd jihad.
Yes, they lied about the specs.
But does this make the card worse? Some raging fellas claimed they would not have bought the 970 with 3.5 GB because of 4K gaming. Seriously, who thought the 970 or 980 would be 4K cards to begin with?

And how many of the raging gamers would have noticed the issue?

My EVGA 970 SLI rig scores 17,400 in Fire Strike, and playing on my ASUS Swift at 2560*1440 at up to 144 fps via G-Sync is just gaming porn and makes my gaming buddies jealous. I am the 1% :dance:
Can't wait to play GTA V on this system.

I hope AMD exploits Nvidia's marketing disaster. That would be best for us customers.
After all, the 970 is a great GPU and still best in slot imho for that price. Don't hesitate to buy one or two for 1080p and 1440p gaming.
Panic sellers might let you get some for way below 300 bucks.

Mr.PayDay
Jan 2, 2004
life is short - play hard

fletcher posted:

Your post convinced me to proceed with my plan to upgrade my 4GB 770 to 970 SLI. I'm gaming at 1080p and I definitely want that sweet sweet FPS for my shiny new G-Sync monitor.

Do it! It is just an amazing gaming experience!

So Nvidia hosed the specs up. They lied. This is infuriating.
We all agree here.
But suddenly we're supposed to pretend the 970s are bad, as if everyone plays at 4K, plays Far Cry 4 all day with fps drops to 0, or plays other crappy ports with poor memory allocation and VRAM management.
Meanwhile we forget that overclocked 970s can post benchmark results that even pass a stock 980, despite only 56 ROPs, 1792 KB of L2 cache, and a 224-bit bus for the 3.5 GB VRAM partition.
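For anyone who wants to sanity-check what that cut-down bus means, here is a quick back-of-the-envelope in Python. It only uses the advertised 7.0 Gbps effective GDDR5 data rate, so treat it as spec-sheet arithmetic, not a measurement:
code:
# Rough bandwidth figures for the 970's split memory layout.
# Assumes the advertised 7.0 Gbps effective data rate per pin (GDDR5).
DATA_RATE_GBPS = 7.0

def bandwidth_gb_s(bus_width_bits):
    # bits per transfer * transfers per second / 8 bits per byte
    return bus_width_bits * DATA_RATE_GBPS / 8

print(bandwidth_gb_s(256))  # full 256-bit card on paper: 224.0 GB/s
print(bandwidth_gb_s(224))  # 3.5 GB partition, 224-bit:  196.0 GB/s
print(bandwidth_gb_s(32))   # last 0.5 GB slice, 32-bit:   28.0 GB/s
The fast partition keeps the vast majority of the card's bandwidth, which is why games that stay under 3.5 GB don't feel a thing.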
Meanwhile I'm wondering how I'm affected while rocking Shadow of Mordor (a game where the 290X outperforms the 970) at 2560*1440 on Ultra with an average of 98 fps (with "drops" to 49 fps according to the benchmark).

Nvidia may suffer, but the 970 is still a thing of beauty at Full HD 1080p and at 1440p.
Stay classy, AMD, and release something to compete.

Mr.PayDay fucked around with this message at 03:05 on Jan 30, 2015

Mr.PayDay
Jan 2, 2004
life is short - play hard

This is amazing, I am crying :lol:

Mr.PayDay
Jan 2, 2004
life is short - play hard
I still don't get all that whining. We all agree that nvidia f*cked up, don't we?
"The 970 works as intended"


In other words, the 970 has worse specs on paper than an AMD 290X but can still match or even top it in benchmarks at 1080p and 1440p, and can even reach stock 980 performance through overclocking: http://www.extremetech.com/computing/190652-overclocking-nvidias-gtx-970-gtx-980-performance-for-a-fraction-the-price


So who is actually affected?

I mentioned earlier that I bought two EVGA 970s for SLI at 2560*1440 gaming, and I am still looking for a (non-Ubisoft) game that can "reproduce" this "problem".
It seems only Ubisoft games cause problems, but that is not a 970 problem; the Failsoft games run really badly on every GPU.

I tested Shadow of Mordor, a game where the AMD GPUs outperform the Nvidia GPUs:

1440p is my gaming scenario:
2560*1440 with Ultra settings (+ object blur) and the 6 GB Ultra textures:
avg: 98 fps
max: 216 fps
min: 42 fps

Well, that was the benchmark; in-game I get a stable 60-90 fps, and it runs superbly.

So the 6 GB texture pack runs on my "3.5 GB" 970s; VRAM usage caps at around 3.5-3.7 GB, but it just works.

I would buy my 970s again anytime for 1440p gaming.
Most people I know are still gaming at 1080p, where 3.5 GB+ should not be an issue right now.
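If anyone wants to see where their own card tops out, a quick way is to log VRAM usage while the game runs. A rough sketch, assuming an Nvidia driver with the nvidia-smi tool on the PATH:
code:
# Rough VRAM logger: poll nvidia-smi every couple of seconds while you play.
import subprocess
import time

def vram_mib(gpu_index=0):
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(v) for v in out.strip().split(","))
    return used, total

while True:
    used, total = vram_mib(0)
    print(f"VRAM in use: {used} MiB / {total} MiB")
    time.sleep(2)
If the number parks just under 3.5 GB even with the 6 GB pack loaded, that is the driver preferring the fast partition, which matches what I see.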


After watching tweakPC's videos showing Shadow of Mordor stuttering on the 290X as well, it is obvious that some people are trying very hard to find problems that may only affect a few players.

Watch this: https://www.youtube.com/watch?v=QCeg6EQKYQo - at 2:07 you see a stuttering 290X. Now what happened there? Does the 290X have even worse memory issues with its "real" 4 GB of VRAM?

In the end I was not able to reproduce anything on my 970s that could rightly be called a "970 VRAM problem". I must be doing something right... or wrong, then.

Mr.PayDay fucked around with this message at 14:56 on Feb 5, 2015

Mr.PayDay
Jan 2, 2004
life is short - play hard

NickPancakes posted:

Is there any major reason not to get the EVGA GeForce GTX 970 SSC ACX 2.0+ right now, seeing as it supposedly has fixed their cooling/coil whine according to most people?
I have two of them: no coil whine, no issues at all.
You can easily OC them. I beat my gaming buddies' stock 980 SLI fps in every game, which really annoys them :newlol:
The EVGAs simply own.

Mr.PayDay
Jan 2, 2004
life is short - play hard

Subjunctive posted:

There is a version of the 970 that fixes the memory issue, and it's called the 980.

Whenever you find a scenario where a 970 surrenders to downsampling and stutter, you won't be able to play it smoothly without similar issues on a 980 either.

Mr.PayDay
Jan 2, 2004
life is short - play hard

fletcher posted:

Got my dual 970s installed the other day. Everything runs smooth as butter on my g-sync display, it's really fuckin sweet. Overkill for 1080p? Maybe...but it looks AMAZING.

Great to read that you're enjoying your "investment". :buddy: Overkill for 1080p? Maybe. But if you ever upgrade to a 27" monitor, you're already set for 1440p.

Mr.PayDay
Jan 2, 2004
life is short - play hard
My EVGA 970 SLI pushes Fire Strike to 19,150 and Fire Strike Extreme to 10,200, and runs The Witcher 3 at 1440p on Ultra settings with HairWorks at 55-65 fps average.

So I was sure I would skip the 980 Ti... until I saw the customs. Holy hell. The superclocked custom versions will mean 980 Ti = "Titan Irrelevant", to make that reference again, and they will be faster than my 970 SLI.
And they have 2 GB more VRAM.
... I meant 2.5 GB more VRAM.


I fear the German superclocked models will be around 800-850 Euros, but I will sell my 970s to buddies for 250 Euros each. That means an investment of roughly 1200 Euros to jump from 970 SLI to 980 Ti SLI. :sludgepal:
Yeah, somewhat stupid, but gaming is my passion, and who needs to eat anyway? I wanna flood my sweet ASUS ROG Swift with frames. I must... I MUST

Titan X SLI Goons with Haswell builds, what are your PSUs like? I have an 800 W be quiet! E10-CM and I am wondering whether it will be enough (I have an i7-5930K OC'd to 4.3 GHz and a Samsung 850 Evo in my rig).

Mr.PayDay
Jan 2, 2004
life is short - play hard

Don Lapre posted:

"The partner did hint at the performance. Apparently the Radeon Fury X ought to be slower than the GeForce GTX 980 Ti."


So this is what has happened so far:
Nvidia dropped the mic with the 980 Ti stunt:

"Hey Gamers!! Here, TitanX performance for 649 bucks!
We just cut 6 GB VRAM and deactivated 200+ shader units. But Partners with their customs and overclocks make sure you get the fastest GPU 2015.
@AMD: Deal with it! :tipshat:
@TitanX Buyers: we hope you understand and support Team Green in the future as well. By buying our new Titan 'Codename N-Fusion ' in 2016. Meanwhile enjoy our Witcher3 gift for ya! :iia: "


Maybe someone at Sapphire is pissed that EVGA and the other Nvidia custom-card partners are already getting rushed with 980 Ti sales and printing money, I guess.

The news and leak lockdown plus the NDA mean AMD has been losing money every hour since 12:00 AM on June 1st.

You can read and watch what is happening in the forums: brand-neutral gamers have been posting stuff like this over the last ~46 hours:
"Hoooly sh*t, Titan X performance for $649-699?! Sorry, don't wanna / can't wait for AMD any longer, buying a 980 Ti now to finally play Witcher 3 and GTA V maxed out"

Team Red needs to deliver. Something, somehow...

Mr.PayDay fucked around with this message at 21:45 on Jun 2, 2015

Mr.PayDay
Jan 2, 2004
life is short - play hard

PirateBob posted:

What games/benchmarks should I get to test out my new GTX 970 and its OC potential? :D

I already have GTA V, Fallout 4, The Witcher 3, Assassin's Creed Syndicate.

Shadow of Mordor with Ultra settings, the Metro games, and their benchmarks should be able to penetrate your new toy :-)

edit: Just Cause 3 will stress your 970 as well.

Mr.PayDay fucked around with this message at 22:39 on Jan 13, 2016

Mr.PayDay
Jan 2, 2004
life is short - play hard
So my 980 Ti is fueling my 27" ROG Swift at 2560*1440 and doing well.
But even the 980 Ti can't hold 60+ fps in the newest games at maxed settings.

And I wanna jump to an ASUS 34" curved 3440*1440 in spring 2017.
I guess I will need the 1080 Ti to play on Ultra settings and still touch 60 fps average, won't I?

Mr.PayDay
Jan 2, 2004
life is short - play hard

Slider posted:

Kinda depends on what game you are talking about. Is your GPU actually being pegged at 100% load? It's possible the game is CPU-bound if you're playing an MMO or something like StarCraft.

I have an i7-6700K at 4.4 GHz, but GTA V, The Witcher 3, Just Cause 3, and Dark Souls 3 all have options that burn fps. Hunting for fps is stupid, I know; it's just my own tech game within the game.

Just turning down one or two options (HairWorks, lol) makes it easier to get above 60 fps average at 1440p.

WoW on Ultra with 4x MSAA is still capped at 144 fps, though, but 2x SSAA plus 8x MSAA (A2C) plus CMAA combined finally tanks the fps below 60 in cities.

I can't imagine a custom 1080 being "enough" to hold 60+ fps "maxed" at 3440*1440.

Mr.PayDay
Jan 2, 2004
life is short - play hard

BIG HEADLINE posted:

I'm fully expecting to lay out something like $2500-3000 for my next build in about 12-15 months from now.
Same here. I will enjoy watching the GPU battles while playing all games maxed out on my 980 Ti with an OC'd 6700K at 1440p.
A custom 1080 Ti and a 6- or 8-core Intel CPU should be a serious upgrade in summer 2017 to push fps on a 32" curved G-Sync monitor.

The new GPU generation should help the 1080p to 1440p jump for everyone.

Mr.PayDay
Jan 2, 2004
life is short - play hard
Can we realistically expect anything but the 1080 Ti from NVIDIA in 2017?
The 980 Ti was their only high-end GPU in 2015, right?
2014: Titan, 970, 980
2015: 980Ti
2016: Titan, 1070,1080
2017: 1080Ti, ?

Coincidence? Pattern?

Mr.PayDay
Jan 2, 2004
life is short - play hard

skooma512 posted:

I got a 1060 yesterday. Dishonored 2 finally runs well. Any other games you guys have been using to benchmark new cards?

Shadow of Mordor, Mafia III, Batman: Arkham Knight, Far Cry Primal, Doom, The Division

Mr.PayDay
Jan 2, 2004
life is short - play hard
https://www.computerbase.de/thema/grafikkarte/rangliste/

It's interesting that the 980 Ti is still the third-fastest GPU, after the Titan and the 1080.
Vega and the 1080 Ti will hopefully slot in close behind the Pascal Titan.

Mr.PayDay
Jan 2, 2004
life is short - play hard
apology for poor english

when were you when amd vega dies?

i was sat at home eating smegma butter when pjotr ring

‘Vega is kill’

‘no’

Mr.PayDay
Jan 2, 2004
life is short - play hard

..btt posted:

Not a direct answer, but I went from a 980ti to a 1080ti on a 1440p 144hz (gsync) monitor, and the most noticeable difference is that the 1080 runs a whole lot quieter. 980ti is likely massive overkill for 1080p, depending on what games you're running.
Why did you upgrade? Any games or mods that limited your experience or did you snag a great deal?
What is your actual performance gain?

I am on the fence. I have the first-gen 27" ASUS ROG Swift, and while the CPU (i7-6700K at 4.4 GHz) still owns, I notice that my 980 Ti can't touch 60 fps average anymore on Ultra settings since I began using ReShade.
ReShade is absolutely amazing and I can never play a game without it anymore (games look SO MUCH better, please try it), but it costs 10-20% of your frames, depending on the selected filters.
The Witcher 3 with HairWorks and every slider on Ultra was still a solid 50-60 fps, but with ReShade I am down to 40-50.
Same for Dishonored 2, The Division, and the newest Batman and Lara Croft titles, in other words all new AAA games.

The 1080 Ti should push me above 60 average fps in every game with ReShade at 2560*1440, but 800-900 Euros for 15-20 additional fps?
Yeah, first world problems... waiting for Volta means one more year, I fear.

Mr.PayDay fucked around with this message at 23:58 on Jul 31, 2017

Mr.PayDay
Jan 2, 2004
life is short - play hard

DrDork posted:

The proportion of $600+ GPU purchasers may not be growing (or it may be growing too slow to matter--either way, I more or less agree with you there), but the proportion of people who buy an expensive GPU and then buy a matching GSync monitor absolutely has been increasing: a few years ago it was 0%. I have no idea what it is now, but it's more than 0%, and more and more "reasonable" GSync monitors keep popping out, making them somewhat less of the true halo product they started out as. GSync monitors get recommended all the time assuming finances allow, and most people who buy them plan on keeping them for far longer than the attendant GPU that they currently
I was one of the "early adopters" in 2014 and paid about 1000 loving Euros for the first-gen 27" Asus PG278Q, and it was one of the "best" investments of my gaming life.
My gaming buddies did not believe my praise of how awesome G-Sync is and what a difference it makes to the gaming experience. G-Sync makes everything so smooth, I can never live without it anymore.
Three years later, not just me but six other buddies plus my brother have switched to NVIDIA and G-Sync. We are all guys around 40 with disposable income. That "group" does not hesitate to spend 1000-2000 Euros on new high-end NVIDIA GPUs and G-Sync monitors every 18-24 months.
My brother bought the PG348Q, which torments even his 1080 Ti, and holy gently caress is it a thing of beauty.

G-Sync carries a "tax" for sure, but it is worth it, and everyone who has been able to test it agrees it makes a difference.
The small group of enthusiasts will grow, I claim, because, yes, why care about 150 extra bucks if you can afford the aforementioned 1000-2000 Euros for GPU and monitor anyway, especially considering the difference in experience with and without G-Sync?

Mr.PayDay
Jan 2, 2004
life is short - play hard

Shrimp or Shrimps posted:

Can those with G-Sync panels say if it makes a significant difference if gaming at 60~120fps? I understand the benefits of gaming below 60fps, but to be honest, my neurotic nature would never let me accept that even if it "appeared" smooth.

It does! I can manually cap my Swift at 60 fps/Hz and then switch back to 144 Hz max mode. I tried this with games like FIFA and Forza Horizon, and even 90 or 100 fps feels and looks so much smoother and "softer" than 60 fps; it makes a noticeable difference in perception.
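For what it's worth, the difference maps straight onto frame times; this is nothing but arithmetic (plain Python, no monitor assumptions):
code:
# Time between frames at a given fps: the gap your eye has to bridge.
for fps in (60, 90, 100, 144):
    print(f"{fps:>3} fps -> {1000 / fps:4.1f} ms per frame")
# 60 fps -> 16.7 ms, 90 -> 11.1 ms, 100 -> 10.0 ms, 144 -> 6.9 ms
Going from 60 to 100 fps cuts every frame gap by about 40 percent, which is exactly that "softer" feeling.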

Mr.PayDay
Jan 2, 2004
life is short - play hard

Truga posted:

I get buying GPUs that often, but monitors were kinda stagnating for really long, what exactly did you buy every 2 years??

There is a "limit", indeed, but I jumped from a a 1080/1200p to a 1440p Monitor and 3440*1440 will follow within a year, so that is 3 new Monitors in 5 years.
I guess the usual jump would be from 1080 to 1440 and from there most players just wait.
I claim Volta might be crucial for the push of the transition from 1080 p to 1440p and to 34 inch (and above) and 4K Monitors and VR .
I claim that the year 2017 will be the last year the majority buys a new monitor for full HD gaming.

Mr.PayDay
Jan 2, 2004
life is short - play hard

Gwaihir posted:

Unless you just keep an FPS counter on to stare at it you won't even notice.

Whenever I read this I want to ask: did you even try it? I don't know whether humans differ in perception or awareness when it comes to realizing and recognizing that 100 fps feels smoother than 40, 50, or 60.
And maybe it depends on the hardware/monitor, but I know and can "see" (read: feel) when the same game differs by 40 or more fps on my Swift.
If I did not notice the difference, I would not have been able to show what a >60 Hz refresh rate feels like, and I could not have "sold" five buddies and my brother on something "one would not notice" in the first place.
The "wtf" moment of experiencing 100+ fps in the same game they had previously played on their 60 Hz monitors was real, several times, for every one of them.

Edit: maybe the very smooth G-Sync experience was another deciding factor, not just the fps difference.

Mr.PayDay fucked around with this message at 20:23 on Aug 7, 2017

Mr.PayDay
Jan 2, 2004
life is short - play hard

ArgumentatumE.C.T. posted:

Something something, double blind studies, something placebo something.

Yeah, I am sure G-Sync sells as a placebo and not because it works as intended, and 144+ Hz monitors sell because no one can see a difference.

Mr.PayDay
Jan 2, 2004
life is short - play hard

eames posted:

G-Sync works best when the GPU struggles to keep steady framerates, the effect drastically diminishes with high end GPUs like your 1080ti.
If you play a lot of e-sports titles it can even make sense to turn off G-Sync because you'll be pinned at max fps/refresh rate all the time and G-Sync adds a tiny amount of input lag.

You can try the Pendulum Demo to see the effect, it has a mode that oscillates between 45 and ~60 fps.

http://www.nvidia.com/coolstuff/demos#!/g-sync

According to "works best:
A common misunderstanding is that G-Sync "works" only noticeable under 60 avg fps or has a "best case" scenario.
It is just that you experience the impact more different to V-Sync depending on the fps corridor.

G-Sync always works; it is only when you can keep a game pinned at, say, 144 fps that you may not notice any difference, and the e-sports scenario, with effects cut and settings on low to avoid distractions, indeed does not "need" G-Sync.

The point of G-Sync is being able to play games maxed out, like The Witcher 3, the newest Lara Croft or Batman, Fallout 4, whatever (even a 1080 Ti cannot push 144 fps on Ultra settings at 1440p), so your fps ranges from, say, 50 to 100 fps, and in that case G-Sync is superior to V-Sync because it stays smooth and stutter-free.

Mr.PayDay
Jan 2, 2004
life is short - play hard

Shrimp or Shrimps posted:

In games like Fallout 4 and The Witcher 3, the added input lag of V-Sync is not really all that important, though, correct?

On my crappy 60hz panel, when BF1 runs at, say, 75fps I see tearing like crazy. However, when it runs at 100fps, I don't. I would assume that is because there are more frames being sent to the monitor so that, at the exact moment the monitor needs to display a frame, it can display a whole one more often.

Which leads me to my next question: If your display has a faster refresh rate, does it mean that tearing is even more noticeable at frame rates below its refresh rate because it is requesting a frame that is only partially there?

To try and rephrase: Will 100fps on a 120hz screen exhibit more tearing than 100fps on a 60hz screen?

It may just be that I'm not sensitive to it. Like I said, on my 60hz monitor, when pushing 100+ frames in BF1, I don't see tearing at all even though, logically, I know it's happening. But if I frame limit to 60fps (not vsync), or even 75fps, I see enormous amounts of tearing.

When the game produces more frames per second than the monitor can refresh per second, you get tearing.
So 100 fps on a 120 Hz screen will produce less tearing, because the 60 Hz monitor starts its next refresh before the complete new frame has been built.
Your 120 Hz screen can show you all 100 fps because it refreshes faster.
But the refresh and render rates are still asynchronous, so you might still experience tearing.

V-Sync limits the frame rate to the refresh rate maximum, which reduces (or usually eliminates) tearing, but it produces that stutter feeling, and that is where the G-Sync module shines: it couples the GPU render rate and the monitor refresh rate so that every rendered frame is shown.
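If that sounds abstract, here is a toy model of the timing in Python. This is not how any driver actually schedules frames, just the idea: with V-Sync a finished frame waits for the next fixed 60 Hz tick, with G-Sync it is shown the moment it is done:
code:
# Toy model: when a rendered frame actually reaches the screen.
import math
import random

REFRESH_MS = 1000 / 60                       # fixed 60 Hz tick: ~16.7 ms
render_times = [random.uniform(14, 22) for _ in range(10)]  # ~45-70 fps

t = 0.0
vsync_shown, gsync_shown = [], []
for rt in render_times:
    t += rt                                  # frame finishes rendering at t
    gsync_shown.append(t)                    # G-Sync: displayed right away
    vsync_shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)  # next tick

def gaps(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("frame-to-frame gaps, V-Sync:", gaps(vsync_shown))  # snaps to 16.7/33.3
print("frame-to-frame gaps, G-Sync:", gaps(gsync_shown))  # follows the GPU
With V-Sync the gaps jump between 16.7 and 33.3 ms even though the GPU itself is fairly steady, and that alternation is the stutter feeling; with G-Sync the gaps simply follow the render times.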

Copypaste from nvidia:
"Q: How does G-SYNC work?
A: Several years in the making, G-SYNC technology synchronizes the display’s refresh to the GPU’s render rate, so images display the moment they’re rendered. The result: Scenes appear instantly. Objects are sharper. Game play is smoother.

Since their earliest days, displays have had fixed refresh rates – typically 60 times a second (Hertz). But due to the dynamic nature of PC games, GPUs render frames at varying rates. As the GPU seeks to synchronize with the display, persistent tearing occurs. Turning on V-SYNC (or Vertical-SYNC) can eliminate tearing but causes increased latency and stutter.

G-SYNC eliminates this tradeoff, perfectly syncing the display to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences."

Mr.PayDay
Jan 2, 2004
life is short - play hard

Ciaphas posted:

I'm still tempted to try to get my hands on a 1080Ti to replace my 980Ti, which struggles some with 1440P@144 without lowering quality settings. Please stop me from/encourage me into making bad decisions with my money.

I am on the same fence but will keep waiting for Volta, because "only" 50 fps instead of 60+ fps on Ultra settings in recent AAA games is a gamer's first world problem.
So spending 800 bucks for 40% more fps (turning a 50 fps game at 1440p into a 70 fps game) seems like a "waste" to me when Volta may be only 7-9 months away.

On the other hand, if you have the money to spare / if it's disposable, why not? Gaming is a hobby and rational arguments may fail; it is about fun, mate.

Mr.PayDay
Jan 2, 2004
life is short - play hard

I prefer the various ReShade options and filters, by the way; they deliver more eye candy than the in-game settings and kill that bright misty haze that feels like a nebulous overlay (the levels.fx filter).

Mr.PayDay fucked around with this message at 10:23 on Aug 19, 2017

Mr.PayDay
Jan 2, 2004
life is short - play hard

Kazinsal posted:

ReShade does wonders for a lot of games and I'd rather use that than crank all the settings to ultra. Helps that I have a 1080p144 monitor and not a 1440p144 monitor.

FFXIV has this horrendous grey slightly desaturating colour grade over everything and using ReShade to remove that just makes the game so much more lively.

Once you play with the LumaSharpen, Clarity, and/or levels.fx filters, games look better. I played The Division and The Witcher 3 yesterday and holy poo poo, just toggling ReShade against the original settings really does make the games look more "true" colored, "lively", and "realistic".

Mr.PayDay
Jan 2, 2004
life is short - play hard
I have been PC gaming since 1991, and I have never had a GPU (or a whole system) last this long without being replaced like my 980 Ti. I usually replace a card every generation. Although it is a GPU from summer 2015, I can still play all games at 2560*1440 on maxed-out settings.

31 (!) months after the 980 Ti's launch, there are still only three faster NVIDIA GPUs (1080, 1080 Ti, Titan P). When did that last happen in NVIDIA's GPU history?
The 780 Ti, for example, had to face the Titan Black, 970, and 980 within 10 months.

The game benchmarks of a new NVIDIA xx80 card this spring or early summer should hopefully and literally deliver 100% of a 980 Ti's performance in the same test builds, settings, and games at 1440p.
This would mean the 980 Ti will have been able to compete at the two most common resolutions, 1080p and 1440p (sometimes around, but mostly well above, 60 average fps), for an absurd three years without needing to reduce graphics settings in any way.
The fantastic OC potential means you could easily use this card for another one or two years, unless game engines and DX12 or whatever seriously tank the fps.

Imho the 980 Ti is the "best" gaming GPU of recent years, performance- and money-wise, and it is not even close.

Goons, opinions on that matter?

Mr.PayDay
Jan 2, 2004
life is short - play hard
You guys are right, the 970 should be mentioned. For Full HD / 1080p gaming the 970 is still amazingly solid even over three years after launch, and the 970 was a serious slam dunk on AMD, with an impact AMD has not recovered from to this day (which is not good for us customers).

Mr.PayDay
Jan 2, 2004
life is short - play hard

Risky Bisquick posted:

The 1070ti is probably faster than your 980ti, so 4 cards. A 290/290X would be pretty close at being great for the money ($200) and longevity.

:doh:
welp, you are right.
Please allow me some mental gymnastics and let me put it this way:
There are four cards that outpace the 980 Ti, but after 31 months they fall into not four but in fact only two tiers of Nvidia GPUs that are a step up fps-wise:
1.) The 1070 Ti and 1080 (very similar performance, 30-40% above the 980 Ti)
2.) The 1080 Ti and Titan P (very similar performance, 70-80% above the 980 Ti)

Mr.PayDay fucked around with this message at 20:08 on Jan 14, 2018

Mr.PayDay
Jan 2, 2004
life is short - play hard
Do you guys remember the somewhat exotic GTX 690?
It is a dual-Kepler GPU with two GK104 (680) chips from summer 2012 (999 Euros / 1200 US$) that owned the GPU competition at 1080p and 1440p (2560*1440 was an absolute niche back then, for early adopters of 27" monitors), could run games at 4K, and even topped the 2013 Titan by a serious margin, throttled only by its dual 2 GB VRAM buffers in specific games.

You can find more than a dozen eBay listings in Germany, most of them around 250 Euros (~306 US$) and the highest at 350 Euros (~425 US$).
:dogbutton:
The 690 still competes with the 980 and 1060 in benchmarks, but it only has 2*2 GB of VRAM, the power consumption is way higher, and, well, it is a six-year-old GPU.

Did or do any Goons here own a 690?

Mr.PayDay
Jan 2, 2004
life is short - play hard
As someone who regularly gets to play on my brother's 1080 Ti at 3440*1440 on a 34" ROG Swift, I can't understand why anyone thinks the 1080 Ti is a "4K/60" GPU.
Of course, if you reduce shaders, shadows, and some fps-killing features, you can gain more fps.
But if you pay a grand for a GPU, why would you turn down settings?
Set all sliders to Ultra in the newest games and add ReShade filters on top (The Division, for example, looks several times better with a few additional filters, and they eliminate that dumb milky-white veil most games have now; of course it depends on what you like and prefer, and I feel the best possible graphics settings add to the immersion in open-world, single-player RPG, and exploration games), which cost another 15% performance, and the 1080 Ti is nowhere near 50 or 60 fps at 3440*1440, which itself is roughly 3.3 million pixels fewer than native 4K UHD at 3840*2160.

So you can already force a 1080 Ti to its knees below 4K, easily.
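The pixel counts behind that, so nobody has to take my word for the math:
code:
# Pixels pushed per frame at the resolutions in question.
resolutions = {
    "2560 x 1440": 2560 * 1440,   # 3,686,400
    "3440 x 1440": 3440 * 1440,   # 4,953,600
    "3840 x 2160": 3840 * 2160,   # 8,294,400 (4K UHD)
}
four_k = resolutions["3840 x 2160"]
for name, px in resolutions.items():
    print(f"{name}: {px:>9,} pixels ({px / four_k:.0%} of 4K UHD)")
3440*1440 is only about 60 percent of the 4K pixel count, so a card that already drops under 60 fps there has no business being called a 4K/60 GPU.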

The 2080 Ti does seem to be the first card that is almost a 4K/60 GPU by my definition, but just look at the newest Tomb Raider, which already forces the 2080 Ti FE below 100 average fps, or Assassin's Creed: Origins and Ghost Recon: Wildlands, where the 2080 Ti FE hovers around 65 fps average.
So for people who thought we would get triple-digit frame rates at 3440*1440 or even 4K for the next few years: nope.

Now wait for The Division 2, Dying Light 2, the coming new Fallout, and the other new AAA games of 2019; they WILL pull the 2080 Ti under 60 average fps at 4K with maxed settings, even without ReShade filters enabled.
(Edit: We will have to see what fps we might gain with the new technology features like DLSS)


For 27" and 2560*1440 tho (and players who WANT best graphic settings and best possible visual immersion) , the 1080Ti, 2080 and 2080Ti are fantastic and allow avg fps way above 60 avg fps and safe between 80-120 fps, depending on the games, in that scenario they are "future proof" for 1440p/60.
The 980Ti, 1070 and 1080 alrady lost the 1440p/60 fps "battle" if you max out the newest games now.

And again, you will gain fps by turning down settings, but that should not be the use case when we compare average fps and resolutions.
We should refer to the max settings the designers and developers allow, imo.

Mr.PayDay fucked around with this message at 08:24 on Sep 21, 2018

Mr.PayDay
Jan 2, 2004
life is short - play hard

LRADIKAL posted:

Lot of words to say "In my opinion, the only way a card can be considered a 4K card is if it can run it at 60fps full resolution and max settings."

I find your definition unreasonable.

Is there an industry standard for "when" and "how" average fps are measured and considered valid?
My definition, if that's something you can agree on, is from the enthusiast POV:
maximizing graphics settings for the best visual immersion, as the devs and designers provide it in their game.
I have many buddies and work colleagues I play Overwatch or PUBG with, and they turn everything down. The most common cards there are the 970, 1050 Ti, and 1060; they value performance over everything.

There are always settings you can sacrifice without noticing a difference; it depends on the game and on each individual slider and option.

But if you have to start lowering graphics values or even disabling options to gain fps at 4K, your card can't hold 60 fps at 4K in that particular scenario, and that's my reference point.

My brother already lowers shadows and some effects to play above 60 fps in more and more games. That proves the 1080 Ti is not even close to a 4K/60 GPU from an enthusiast graphics perspective.

You can play at 4K on a 980 Ti as well and get 60 fps; does that make it a 4K card? Rhetorical question, you know where I am going with this.

Mr.PayDay
Jan 2, 2004
life is short - play hard
After two days of gaming, my Zotac AMP 2080 Ti froze in AC Odyssey and shredded my Windows 10; I could not boot again.
I reinstalled Win10, and when I tried to install the GeForce drivers I got a black screen and a freeze again.
I RMA'd it: defective memory, what a surprise :argh:
I got a new Zotac 2080 Ti; it is working flawlessly after three days of heavy stress tests, benchmarks, and several hours of nonstop gaming and overclocking, so this one seems fine.

Mr.PayDay
Jan 2, 2004
life is short - play hard
My Zotac 2080 Ti is already up to 1349 Euros at the German Caseking store, and the MSI is listed at 1499 Euros, because it is Christmas month and people have money to spend.
It's absurd.
The 1080 Ti is running scarce, so soon there will be no way to "avoid" the 2080 or 2080 Ti if you came late. Used 1080 Ti cards are already going for 500-600 Euros on eBay over here as well.

I usually stop thinking about the money burnt once I am hysterically giggling at the frames the 2080 Ti pushes in every game at 1440p. Just Cause 4 releases tomorrow, and knowing I can play it on Ultra at triple-digit fps (unless the engine is AssCreed Odyssey poo poo) makes me :allears:
The transition to a 4K monitor won't require a system upgrade; that's the other good news.

I am really curious whether and how ray tracing gets implemented and improved in the next triple-A games.

Mr.PayDay
Jan 2, 2004
life is short - play hard
So my second Zotac 2080 Ti is dying within 1.5 weeks; RMA again if it freezes again tomorrow. Let's hope for round #3.


Mr.PayDay
Jan 2, 2004
life is short - play hard

il serpente cosmico posted:

How is it behaving, out of curiosity?

[two smartphone photos of the monitoring overlays at the moment of the freeze]
Two crappy smartphone-cam pics that I took immediately, to "prove" to my retailer/seller that I had CPU and GPU monitoring tools running that showed no unusual behaviour; I did not OC the Zotac.
The GPU was at 65 °C at the moment of the freeze, the 9900K at about 70 +/- 2 °C on all 8 cores.
The freeze happened after 20 minutes of playing Battlefield V, and before that I got some "flickering" every other minute, as if a "render cycle" was missing; I don't know how to describe it better.
The system rebooted without problems, though, and I could play Battlefield V, Just Cause 4, and Overwatch again and switch from fullscreen to windowed and so on.

No more flickering so far; that would be my "early warning" next time.
