Yudo
May 15, 2003

Paul MaudDib posted:

AMD has traditionally held the cryptocurrency crown for some architectural reason I don't know. I think it might be that AMD focuses on integer performance while NVIDIA focuses on floating-point performance? Or instructions that AMD implements that NVIDIA doesn't?

If I am remembering correctly, you hit the nail on the head. There are a few integer operations in particular implemented on AMD chips that make them much better at SHA-256 hashing than their NV equivalents. Also, and I may be mistaken, AMD consumer chips have more compute units than their NV counterparts for some reason (shaders, of course, being vastly more relevant for games).

I am not that sophisticated in this arena, so someone may correct me.


SwissArmyDruid
Feb 14, 2014

by sebmojo

Paul MaudDib posted:

AMD has traditionally held the cryptocurrency crown for some architectural reason I don't know. I think it might be that AMD focuses on integer performance while NVIDIA focuses on floating-point performance?

In one.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Yudo posted:

Also, and I may be mistaken, AMD consumer chips have more compute units than their NV counterparts for some reason (shaders, of course, being vastly more relevant for games).

Shaders [shader programs] run in parallel (SIMD) on compute units (driving lockstep groups of 32 or 64 individual compute cores, branded stream processors or CUDA cores) in modern GPU architectures. There's no such thing as a dedicated shader unit anymore; it's all just programs that run on a general-purpose SIMD processor.

AMD sometimes refers to the compute units as "shader engines", but this refers to the SIMD compute processors, or an aggregation of SIMD compute processors, along with their associated support hardware (memory units, texture units, ROPs, etc.). This is just an architectural decision; having your processors subdivided into smaller groups isn't necessarily better.

e: I'm guessing you know this, but for those reading along, the raw number of cores isn't a relevant number either. NVIDIA's CUDA cores are significantly more complex/powerful than AMD's stream processors on an individual basis; AMD just uses more of them. You can only compare core/stream-engine counts within an architecture. I would guess that's why NVIDIA does better at lower resolutions and AMD catches up at higher resolutions - a greater degree of parallelism (higher resolution) will better exploit a greater number of processors.
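
For those reading along who want to see what "programs running on a SIMD processor" actually means, here's a minimal CUDA sketch (the sizes and names are just for illustration, not either vendor's exact hardware). Each thread handles one element, and the hardware runs the threads in fixed-size lockstep groups - warps of 32 on NVIDIA, wavefronts of 64 on GCN - so a pixel or vertex shader is essentially this kind of kernel, just generated by the driver:

code:
// Minimal "shader-like" program expressed as a CUDA kernel.
// The hardware executes threads in lockstep groups (warps of 32 on
// NVIDIA, wavefronts of 64 on AMD GCN), one element per thread.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void shade(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one "pixel" per thread
    if (i < n)                                      // divergent lanes get masked off, not skipped for free
        out[i] = in[i] * 0.5f + 0.25f;              // stand-in for real shading math
}

int main()
{
    const int n = 1 << 20;
    float *in = nullptr, *out = nullptr;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = float(i) / n;

    shade<<<(n + 255) / 256, 256>>>(in, out, n);    // 256 threads per block = 8 warps
    cudaDeviceSynchronize();

    printf("out[42] = %f\n", out[42]);
    cudaFree(in);
    cudaFree(out);
    return 0;
}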

Paul MaudDib fucked around with this message at 05:43 on Apr 26, 2015

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Yudo posted:

If I am remembering correctly, you hit the nail on the head. There are a few integer operations in particular implemented on AMD chips that make them much better at SHA-256 hashing than their NV equivalents. Also, and I may be mistaken, AMD consumer chips have more compute units than their NV counterparts for some reason (shaders, of course, being vastly more relevant for games).

I am not that sophisticated in this arena, so someone may correct me.
NV only had barrel shifters on GK110; AMD had them everywhere.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Paul MaudDib posted:

AMD has traditionally held the cryptocurrency crown for some architectural reason I don't know. I think it might be that AMD focuses on integer performance while NVIDIA focuses on floating-point performance? Or instructions that AMD implements that NVIDIA doesn't?

EDIT: Never mind, it's actually because the mining algorithm requires good performance on a 32-bit integer shift operation that is faster on AMD cards than on Nvidia's.
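
If I have the details right, the operation in question is the 32-bit right-rotate that SHA-256 leans on in its round functions. A minimal sketch (written as CUDA so it can run on either host or device): with a hardware barrel-shift/bit-align instruction this is a single op, while without one it turns into two shifts plus an OR, which is roughly where the AMD advantage was said to come from. Treat the instruction-level details as from memory, not gospel.

code:
// 32-bit right-rotate, used throughout SHA-256's round function.
// With a single-cycle barrel shifter / bit-align instruction this is
// one op; without one it compiles to two shifts plus an OR.
#include <cstdio>

__host__ __device__ unsigned int rotr32(unsigned int x, unsigned int n)
{
    return (x >> n) | (x << (32u - n));   // assumes 0 < n < 32
}

// Example: SHA-256's "big sigma 0" mixing function is three rotates XORed together.
__host__ __device__ unsigned int big_sigma0(unsigned int x)
{
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22);
}

int main()
{
    // 0x6a09e667 is SHA-256's first initial hash word, used here only as sample input.
    printf("big_sigma0(h0) = %08x\n", big_sigma0(0x6a09e667u));
    return 0;
}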

MaxxBot fucked around with this message at 06:20 on Apr 26, 2015

Remo
Oct 10, 2007

I wish this would go on forever
Recently Adobe released Lightroom 6, and one of the key improvements is that it can better leverage the GPU for faster editing.

I am currently using a 6850 and I am happy with it for the games I play (Diablo III and LOL) but it seems like this card does not meet the minimum requirements for GPU acceleration in LR 6, and hence I am not really seeing any improvements in performance after upgrading to LR 6.

I feel like getting a GTX 970 but have been unable to justify it as I am not really a hardcore gamer, but if it gives a significant boost to LR 6 performance then I would give it serious consideration.

Just wanted to ask: if anyone here has a GTX 970 and has upgraded from LR5 to LR6, what has your experience been like? Thanks in advance!

sauer kraut
Oct 2, 2004

Remo posted:

Recently Adobe released Lightroom 6, and one of the key improvements is that it can better leverage the GPU for faster editing.

I am currently using a 6850 and I am happy with it for the games I play (Diablo III and LOL) but it seems like this card does not meet the minimum requirements for GPU acceleration in LR 6, and hence I am not really seeing any improvements in performance after upgrading to LR 6.

I feel like getting a GTX 970 but have been unable to justify it as I am not really a hardcore gamer, but if it gives a significant boost to LR 6 performance then I would give it serious consideration.

Just wanted to ask: if anyone here has a GTX 970 and has upgraded from LR5 to LR6, what has your experience been like? Thanks in advance!

http://protogtech.com/adobe-lightroom/do-i-need-a-powerful-video-card-for-lightroom-cc-6/

fuckpot
May 20, 2007

Lurking beneath the water
The future Immortal awaits

Team Anasta
Just bought a couple of 980s after having used AMD/ATI my whole life. I think the last nvidia card I used was that one after the Riva TNT 2. I am really appreciating how much better the software is and can see why people have always preferred nvidia. I was having some trouble getting the DSR to work. I've watched some tutorials and the option used to switch it on simply isn't there. I did read that it will only pop up on supported games but I have tried a whole bunch and none of them allow it. Are games that support it rare? I mainly want it for Pillars of Eternity to increase the detail on the player models. Also in my googling I found that it does support 1440p monitors but it was an old article - maybe that has changed?

edit: running Windows 8.1. Latest drivers and everything is updated.

Groen
Oct 7, 2008
Enabling it should be easy: set the factors in the Nvidia control panel's 3D settings and those resolutions should become selectable in-game.
It's not a DSR setting in-game, just a resolution.
Only works on real (exclusive) fullscreen apps.

Ragingsheep
Nov 7, 2009

fuckpot posted:

Just bought a couple of 980s after having used AMD/ATI my whole life. I think the last nvidia card I used was that one after the Riva TNT 2. I am really appreciating how much better the software is and can see why people have always preferred nvidia. I was having some trouble getting the DSR to work. I've watched some tutorials and the option used to switch it on simply isn't there. I did read that it will only pop up on supported games but I have tried a whole bunch and none of them allow it. Are games that support it rare? I mainly want it for Pillars of Eternity to increase the detail on the player models. Also in my googling I found that it does support 1440p monitors but it was an old article - maybe that has changed?

edit: running Windows 8.1. Latest drivers and everything is updated.

There's no "DSR - Factors" under Manage 3D settings?

wolrah
May 8, 2006
what?
DSR and custom resolutions are incompatible with each other for some reason, so if you're using custom resolutions to overclock your display DSR gets disabled.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
The real problem is that DSR doesn't work with SLI.

EDIT: You're using a 1440p monitor? Is it a G-Sync one?

https://forums.geforce.com/default/topic/784180/geforce-drivers/system-configurations-currently-supporting-dsr-as-of-driver-344-65/1/

This is a table for what setups are supposed to be able to support DSR:
code:
                                                Single GPU              SLI
Standard Monitor (2560x1600 or lower)           Yes                     Yes
G-SYNC Monitor (any resolution)                 Yes                     In Development**
Surround Monitor Configuration                  In Development**        In Development**
4K Monitor -- SST (single wide)                 Yes                     In Development**
4K Monitor -- MST (tiled display)               In Development**        In Development**
3DTV Play                                       Yes*                    Yes*
3D Vision                                       Yes*                    Yes*
Discrete-GPU Notebooks                          In Development**        In Development**
Optimus Notebooks                               In Development**        In Development**
MS Hybrid Notebooks                             In Development**        In Development**

Sidesaddle Cavalry fucked around with this message at 18:42 on Apr 26, 2015

Haeleus
Jun 30, 2009

He made one fatal slip when he tried to match the ranger with the big iron on his hip.
Has anyone had issues playing any video with VLC after installing the latest Nvidia driver (the GTA V one)? I'm not 100% sure if it's related, but since around the same time I cannot play a single video past 5-10 minutes before the video locks up/goes black while the audio keeps going, and if I try to interact with the program it just shuts off entirely. I tried reinstalling to no avail, so I'm thinking it may be driver related.

fuckpot
May 20, 2007

Lurking beneath the water
The future Immortal awaits

Team Anasta

wolrah posted:

DSR and custom resolutions are incompatible with each other for some reason, so if you're using custom resolutions to overclock your display DSR gets disabled.
This would be it, I am running a custom resolution to overclock my monitor. Sorry should have mentioned that. Thanks.

penus penus penus
Nov 9, 2014

by piss__donald

fuckpot posted:

This would be it, I am running a custom resolution to overclock my monitor. Sorry should have mentioned that. Thanks.

You have a QNIX, fuckpot?

Dr Snofeld
Apr 30, 2009
I'm considering upgrading from my Radeon HD 7770 to a 2GB card at some point in the nearish future, but I don't know much about hardware - specifically I'm concerned about accidentally blowing out my PSU by installing a card that drains too much power. Would upgrading from a 1GB card to a 2GB card offer any major performance increase in and of itself? I checked the OP but it seems to be a few years out of date.

quote:

Case: Corsair Carbide Series 300R Case
Power Supply: Corsair CX600 600W Power Supply
Processor: Intel Core i5 3570K 6MB Cache Socket 1155
CPU Cooler: Xigmatek Dark Knight Knighthawk CPU Cooler
Graphics Card 1: Radeon HD7770 1GB GDDR5 PCI-Express Graphics Card
Memory: 8GB Corsair DDR3 1600MHz C9 Dual Channel Memory Kit (2 x 4GB)
Motherboard: Asus P8Z77-V LX Intel Z77 (Socket 1155) Motherboard
Hard Disk Drive One: Corsair Force Series 3 120GB SATA III 6Gb/s Solid-State Hard Drive
Hard Disk Drive Two: Seagate 1TB Barracuda 64MB Cache SATA III Hard Disk Drive
Optical Drive One: DVD-RW 22x
Sound Card: Onboard HD 7.1 Audio

The Iron Rose
May 12, 2012

:minnie: Cat Army :minnie:

Dr Snofeld posted:

I'm considering upgrading from my Radeon HD 7770 to a 2GB card at some point in the nearish future, but I don't know much about hardware - specifically I'm concerned about accidentally blowing out my PSU by installing a card that drains too much power. Would upgrading from a 1GB card to a 2GB card offer any major performance increase in and of itself? I checked the OP but it seems to be a few years out of date.

You want the parts picking thread, but it's worth noting that VRAM is way less relevant than how powerful a graphics card actually is on its own.

Get a 970 if you can afford it, or a ~$230 after-rebate R9 290 if you can't. Below that it's not a fantastic value for your money.

eggyolk
Nov 8, 2007


Sounds like a GTX 960 would be perfect for you. Gives twice the performance for an extra 40 watts.

e: never mind, you have a 600 watt PSU; get an R9 290 or GTX 970

Dr Snofeld
Apr 30, 2009

eggyolk posted:

Sounds like a GTX 960 would be perfect for you. Gives twice the performance for an extra 40 watts.

e: never mind, you have a 600 watt PSU; get an R9 290 or GTX 970

Well, I guess that future won't be as nearish as I thought it might be, but I'll certainly make a note of these suggestions for when I am better able to upgrade. Thanks very much.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Dr Snofeld posted:

I'm considering upgrading from my Radeon HD 7770 to a 2GB card at some point in the nearish future, but I don't know much about hardware - specifically I'm concerned about accidentally blowing out my PSU by installing a card that drains too much power. Would upgrading from a 1GB card to a 2GB card offer any major performance increase in and of itself? I checked the OP but it seems to be a few years out of date.

These days you want a 4GB card, especially as that seems to be the new standard and lots of future console ports will want over 2GB. The cards already suggested are good choices, though I would avoid the 4GB 960s since they have a very narrow memory bus that chokes the whole card. The 970 is well worth it, especially if you want to get The Witcher 3, since they are running an offer where if you buy a 970 you get TW3 free when it comes out.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr Snofeld posted:

I'm considering upgrading from my Radeon HD 7770 to a 2GB card at some point in the nearish future, but I don't know much about hardware - specifically I'm concerned about accidentally blowing out my PSU by installing a card that drains too much power. Would upgrading from a 1GB card to a 2GB card offer any major performance increase in and of itself? I checked the OP but it seems to be a few years out of date.

600W is enough PSU for any *single* GPU on the market; you've got nothing to worry about.

For 1080p-and-below gaming I'll echo everyone else: buy yourself a GTX 970. There are tons of them open-box on Newegg lately if you feel like taking a chance on the accessories. You will not get the Witcher bundle if you do that, however.

If you are going to do something oddball like 1440p / 144hz / multi-monitor / 4K / etc then you should consider aiming higher, or waiting to see what the 390X is like when it comes out.

There's no real reason to buy a 2GB card unless you're reeeeally scraping bottom. A used 3GB 7950 is like $110 nowadays and that's the bare minimum I'd buy right now. The general consensus is that GPU memory usage of games is headed for a pretty steep incline in the near future.

Something in the GTX 970 / R9 290 / R9 290X class will give you a nice bump in performance and a reasonable degree of future proofing. Again, you may want to wait a month and see what the 390X release does to prices, it should drive them down somewhat across the board I'd think.

Paul MaudDib fucked around with this message at 02:19 on Apr 28, 2015

TomR
Apr 1, 2003
I both own and operate a pirate ship.
This may be a crazy question, but I have a 2560x1440 monitor and a GTX 970. I've been playing Elite but it doesn't seem to have anti-aliasing that works. If I use DSR I get really, really bad framerates. Is there a way to use a DSR mode for 1080p with a non-1080p monitor? It only gives me multiples of the monitor's native resolution.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

TomR posted:

This may be a crazy question, but I have a 2560x1440 monitor and a GTX 970. I've been playing Elite but it doesn't seem to have anti-aliasing that works. If I use DSR I get really, really bad framerates. Is there a way to use a DSR mode for 1080p with a non-1080p monitor? It only gives me multiples of the monitor's native resolution.
That's not how DSR works. The only reason DSR looks good is that you actually are rendering a multiple of your native resolution - likely millions more pixels than you would at native. That doesn't get easier if you theoretically swapped your 1440p monitor for a 1080p one, and it's not compatible with downscaling a game the way you'd want, either. (You also wouldn't get a noticeable quality increase, because there wouldn't be enough extra pixels to put together.)

"hello i'd like to order a happy meal, also super size it"

What other methods of AA have you tried so far outside of the in-game options? E: There are third-party tools like nvidiainspector that allow manual forcing of MSAA and the like into games.

Sidesaddle Cavalry fucked around with this message at 05:15 on Apr 28, 2015

Stanley Pain
Jun 16, 2001

by Fluffdaddy
You could also tool around with SMAA injector.

TomR
Apr 1, 2003
I both own and operate a pirate ship.
I've only changed settings in the NV control panel and it didn't seem to do much of anything. In-game it ranges from jaggies to sparkly jaggies when I change settings. I'll go read up on injectors, thanks. :)

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DSR is a multiple of native resolution in the sense that aspect ratio is preserved, but it doesn't have to be an integer multiple. I suspect that relatively few people use DSR at 2x rather than 1.25 or 1.5. You can select which DSR resolutions are available in the NVIDIA control panel.
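
As I understand it, the factors in the control panel multiply the total pixel count, so each dimension scales by the square root of the factor (and the driver rounds the result slightly). A quick sketch of the math with a few example factors, using a 1440p panel:

code:
// DSR factors scale the *pixel count*, so each dimension grows by sqrt(factor).
// This is my understanding of how the control panel numbers map to render
// resolutions; the driver rounds the results slightly.
#include <cmath>
#include <cstdio>

int main()
{
    const int native_w = 2560, native_h = 1440;            // example: a 1440p monitor
    const double factors[] = { 1.25, 1.50, 2.00, 4.00 };   // example factors

    for (double f : factors) {
        int w = static_cast<int>(std::lround(native_w * std::sqrt(f)));
        int h = static_cast<int>(std::lround(native_h * std::sqrt(f)));
        printf("%.2fx DSR -> render at %dx%d, downscale to %dx%d\n",
               f, w, h, native_w, native_h);
    }
    return 0;
}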

Cojawfee
May 31, 2006
I think the US is dumb for not using Celsius
There's no way to trick your GPU into outputting at 1080 but really rendering at something else when your monitor is 1440. When you set up DSR, Windows actually thinks your monitor runs at that resolution. So if I set it up so I can run at 4K, I can actually set my Windows desktop resolution to 4K. Windows and any game that supports that resolution will render at it. The GPU itself will downscale back to 1440p for the monitor.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Some engines (Unreal for one) allow the internal rendering resolution to differ from the display resolution, meaning that you could have it basically do 1080p worth of work and then scale it up to display at 1440p. Roughly the opposite of DSR, perhaps "dynamic sub resolution"!

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Subjunctive posted:

Some engines (Unreal for one) allow the internal rendering resolution to differ from the display resolution, meaning that you could have it basically do 1080p worth of work and then scale it up to display at 1440p. Roughly the opposite of DSR, perhaps "dynamic sub resolution"!

This is how all the consoles work, and I think it will be the trick that makes gaming possible on high-end integrated graphics. I'd gladly play at upscaled 720p on HD 6000 graphics if the UI could somehow still be rendered at the proper scale.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I'm reeeeealy curious where the 390X is going to clock in. If it has the rumored 4096 cores, that would put it at about a 45% boost in performance over the 290X, even ignoring any architectural improvements. At 4K that would basically be Titan X territory.
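
Back-of-the-envelope for that, assuming performance scales linearly with shader count at the same clocks (a generous assumption - real scaling is usually a bit worse, with 4K getting closest):

code:
// Naive linear-scaling estimate: rumored Fiji shader count vs. the 290X's 2816.
// Assumes identical clocks and perfect scaling, which only heavy 4K workloads approach.
#include <cstdio>

int main()
{
    const int cores_290x = 2816;   // Hawaii / R9 290X stream processors
    const int cores_390x = 4096;   // core count rumored for the 390X at the time
    printf("naive uplift: %.0f%%\n",
           (static_cast<double>(cores_390x) / cores_290x - 1.0) * 100.0);  // ~45%
    return 0;
}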

wolrah
May 8, 2006
what?

Subjunctive posted:

Some engines (Unreal for one) allow the internal rendering resolution to differ from the display resolution, meaning that you could have it basically do 1080p worth of work and then scale it up to display at 1440p. Roughly the opposite of DSR, perhaps "dynamic sub resolution"!

Isn't this basically what happens when you select "GPU Scaling" in the control panel for resolutions that differ from native? The GPU delivers a native res signal to the monitor but the OS thinks it's dealing with something smaller.

SlayVus
Jul 10, 2009
Grimey Drawer

wolrah posted:

Isn't this basically what happens when you select "GPU Scaling" in the control panel for resolutions that differ from native? The GPU delivers a native res signal to the monitor but the OS thinks it's dealing with something smaller.

I actually commend Nvidia for their GPU resolution scaling options. If I am remembering correctly, you can scale old 4:3 games up while keeping them 4:3 instead of stretching them to widescreen. Playing things like Fallout and Diablo stretched to widescreen was weird.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
Is there a comparable benchmark I can run to get an idea of how well I will run Witcher 3?
Is Witcher 2 on the same engine?

If I need to upgrade my graphics, I would rather do it now while I can still get a free copy of the game.

If there is a better thread to ask this in, please let me know and I'll move it.

Fauxtool fucked around with this message at 04:11 on Apr 29, 2015

GrizzlyCow
May 30, 2011
Only way to tell how well your system will run Witcher 3 is to play Witcher 3. You got a GTX 970 and an i5, so you're probably fine.

When it comes out, you can check out Guru3d or HardOCP to see if they have a performance review.

This is probably a better question for the parts picking thread, but you won't get a good answer until Witcher 3 is out.

BurritoJustice
Oct 9, 2012

IIRC the game was said to run maxed out at 1080p on a 980, but it wouldn't surprise me if a 970 will do basically the same thing.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

GrizzlyCow posted:

Only way to tell how well your system will run Witcher 3 is to play Witcher 3. You got a GTX 970 and an i5, so you're probably fine.

When it comes out, you can check out Guru3d or HardOCP to see if they have a performance review.

This is probably a better question for the parts picking thread, but you won't get a good answer until Witcher 3 is out.

cool.

The last game I had to upgrade for was FFXIV: A Realm Reborn. They released an excellent benchmarking tool well before the game launched. I wish more games would do that kind of thing. It totally nailed how much power the game needed.

Fauxtool fucked around with this message at 09:23 on Apr 29, 2015

sauer kraut
Oct 2, 2004

BurritoJustice posted:

IIRC the game was said to run maxed out at 1080p on a 980

Kinda, they disabled Nvidia HairWorks on their demo boxes to get (mostly) stable 60fps on a 980. That poo poo is a huge hog.
But everything else is fine for 970/980's.

Diviance
Feb 11, 2004

Television rules the nation.

Fauxtool posted:

cool.

The last game I had to upgrade for was FFXIV: A Realm Reborn. They released an excellent benchmarking tool well before the game launched. I wish more games would do that kind of thing. It totally nailed how much power the game needed.

I agree, that benchmark tool was spot on. Shame they don't put that much thought and effort into their other games on PC...

macnbc
Dec 13, 2006

brb, time travelin'
So I'm upgrading from an old GTX 560 Ti to a GTX 970.

I noticed that the 970 uses PCIe 3.0 while my current motherboard is still PCIe 2.0. I know it's backwards-compatible, but will I notice any performance hit at all?


Kazinsal
Dec 13, 2011



You won't notice it. PCIe 2.0 is already a huge amount of bandwidth on even an 8x link. A 16x 2.0 link has the same bandwidth as an 8x 3.0 link, and you generally do SLI on 4x or 8x links.
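
The arithmetic behind that, for anyone curious: PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding, while PCIe 3.0 runs 8 GT/s per lane with the much lighter 128b/130b encoding, so per-lane throughput roughly doubles and a 2.0 x16 link lands within a couple percent of a 3.0 x8 link. A quick sketch using just the spec numbers (it ignores protocol overhead like packet headers and ACKs):

code:
// Per-direction PCIe link bandwidth = lanes * transfer rate * encoding efficiency / 8 bits.
// Rates and encodings are from the PCIe 2.0 / 3.0 specs; protocol overhead is ignored.
#include <cstdio>

static double link_gb_per_s(int lanes, double gtransfers_per_s, double encoding_efficiency)
{
    return lanes * gtransfers_per_s * encoding_efficiency / 8.0;  // GT/s -> GB/s
}

int main()
{
    const double gen2_rate = 5.0, gen3_rate = 8.0;   // GT/s per lane
    const double gen2_eff = 8.0 / 10.0;              // 8b/10b encoding
    const double gen3_eff = 128.0 / 130.0;           // 128b/130b encoding

    printf("PCIe 2.0 x16: %.2f GB/s\n", link_gb_per_s(16, gen2_rate, gen2_eff));  // ~8.00
    printf("PCIe 3.0 x8 : %.2f GB/s\n", link_gb_per_s( 8, gen3_rate, gen3_eff));  // ~7.88
    printf("PCIe 2.0 x8 : %.2f GB/s\n", link_gb_per_s( 8, gen2_rate, gen2_eff));  // ~4.00
    return 0;
}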
