xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Bridgeless crossfire works well, but I think that's a newer development than the 7870, and in general it's usually better to just get a new card.


JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

sout posted:

I've been thinking about getting a second 7870 for Crossfire but the more I research about it the more potential issues I'm finding, for example:
how the hell are you supposed to cool the top card if the fan is basically cut off by the other card
potential problems with microstuttering
not having enough PCI-e adapters in my PSU (which I think could be fixed relatively easily with some adapters anyway).

I dunno, I guess since it's such an old card at this point it wouldn't be a great idea anyway. Having a single, more powerful card seems like much less hassle.

Yeah, at this point a 7870 isn't strong enough to be worth doing much with as far as recent games go, and two of 'em isn't much better. If you're on a serious budget and like AMD, a good used 7950 is still viable for newish stuff at 1080p w/some eye candy turned down, and they're only about a hundred bucks used.

I put one in my secondary box and it ran GTAV surprisingly well for a four-year-old card.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
Yeah I'd sell the 7870 for whatever you could get out of it and look for a used 7950, 7970, 280 (AMD), or 280X instead. Some of the used 280's or 280X's may even still have a little bit of warranty available for the MSI, ASUS, or Gigabyte models.

Odette
Mar 19, 2011

Since NVIDIA are pushing all future drivers via GFE, are there any plans for a public non-NVIDIA repository of drivers? Because I don't see a way of rolling back if the drivers are FUBAR.

Star War Sex Parrot
Oct 2, 2003

Odette posted:

Since NVIDIA are pushing all future drivers via GFE, are there any plans for a public non-NVIDIA repository of drivers? Because I don't see a way of rolling back if the drivers are FUBAR.
I imagine Guru3D will still find a way to have standalone driver installers.

SlayVus
Jul 10, 2009
Grimey Drawer
It's not impossible to grab the installation file from GFE.

EoRaptor
Sep 13, 2003

by Fluffdaddy

Odette posted:

Since NVIDIA are pushing all future drivers via GFE, are there any plans for a public non-NVIDIA repository of drivers? Because I don't see a way of rolling back if the drivers are FUBAR.

Only game ready drivers are going GFE exclusive. nVidia will still be posting quarterly drivers on their website.

If you want some previous game ready driver, though, you are poo poo out of luck unless you trust a third party website to 'extract' it.

dissss
Nov 10, 2007

I'm a terrible forums poster with terrible opinions.

Here's a cat fucking a squid.

SlayVus posted:

It's not impossible to grab the installation file from GFE.


In fact GFE seems to keep every driver ever in both its compressed and uncompressed forms - kinda annoying if you have a small SSD.

repiv
Aug 13, 2009

nVidia has finally acknowledged the broken power saving with high refresh rates and says a driver fix is coming.

1gnoirents
Jun 28, 2014

hello :)

nVidia posted:

We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors.

Guess what? You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for GSYNC and very high refresh rates.

As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays.

We’ll have this fixed in an upcoming driver.

just... evil

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

1gnoirents posted:

just... evil

What? That doesn't make sense either way.

Odette
Mar 19, 2011

It's been a known bug for >120 Hz monitors for quite a while.

NVIDIA are only fixing it for PR reasons, to be honest.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

What's the new PR dimension?

Odette
Mar 19, 2011

Subjunctive posted:

What's the new PR dimension?

NVIDIA posted:

"We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors."

Sounds to me like they're trying to avoid associating high power draw with gsync.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

xthetenth posted:

What? That doesn't make sense either way.

It's morally wrong of Nvidia to do anything that makes their products better because AMD is in such a bad position market share wise. Stuff like this just shows that Nvidia are a bunch of bullies beating up on poor, defenseless AMD! How dare they! :qq:

At least I believe that is the logic being used here.

Daviclond
May 20, 2006

Bad post sighted! Firing.

AVeryLargeRadish posted:

It's morally wrong of Nvidia to do anything that makes their products better because AMD is in such a bad position market share wise. Stuff like this just shows that Nvidia are a bunch of bullies beating up on poor, defenseless AMD! How dare they! :qq:

At least I believe that is the logic being used here.

The quoted text specifically is a really smarmy PR-ish response given that the issue has been around for ages and affects pretty much all (?) >120Hz monitors and certain multi-monitor configurations. To suddenly pipe up with a "Guess what? You were right!" and imply it's a G-Sync-only issue has the feel of greasy corporate bullshit, doubly so given that the problem was ignored until the tech media started writing about it.

I'm all for the issue getting fixed though :)

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Daviclond posted:

The quoted text specifically is a really smarmy PR-ish response given that the issue has been around for ages and affects pretty much all (?) >120Hz monitors and certain multi-monitor configurations. To suddenly pipe up with a "Guess what? You were right!" and imply it's a G-Sync-only issue has the feel of greasy corporate bullshit, doubly so given that the problem was ignored until the tech media started writing about it.

I'm all for the issue getting fixed though :)

Yeah, it's standard corporate fare, and frankly behavior like this is why AMD folding would be such a huge loss, but evil's a bit far.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Subjunctive posted:

What's the new PR dimension?

PCPer calling them on it.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

SwissArmyDruid posted:

PCPer calling them on it.

Oh, well, if anyone here works at a company where having something show up in the press doesn't sometimes cause reprioritization, let me know. I'd like to come on a tour, maybe get a picture taken with the monks.

SwissArmyDruid
Feb 14, 2014

by sebmojo
I suppose the distinction is that it was PCPer, instead of, say, a bunch of dudes on the Overclock.net forums.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

Kazinsal posted:

Fan curve's gotta be around 50% at 50 C and 100% at 80 C. Hope you have noise cancelling headphones! :getin:
Days late, but I have a reference blower R9 290 and with Catalyst set to 52% fan it works fine with only rare minor throttling. Keeps it at around 91°C. It's kind of loud, but not as loud as when a program set the fan to 100% and almost killed me. Doesn't bother me at all with normal headphones on and gaming.
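Kazinsal's numbers describe a simple two-point linear ramp. Here's a quick Python sketch of that curve - the 50 °C/50% and 80 °C/100% set points are from the quoted post, while the clamping outside that range is my assumption about how such curves usually behave:

```python
# Illustrative linear fan curve: 50% duty at 50 °C ramping to 100% at 80 °C,
# clamped flat outside that range. Set points are from the quoted post;
# this is a sketch, not any vendor's actual fan-control logic.
def fan_duty(temp_c: float) -> float:
    """Return the fan duty cycle in percent for a given GPU temperature."""
    low_t, high_t = 50.0, 80.0    # temperature set points (°C)
    low_d, high_d = 50.0, 100.0   # duty cycle at those points (%)
    if temp_c <= low_t:
        return low_d
    if temp_c >= high_t:
        return high_d
    # Linear interpolation between the two set points.
    return low_d + (high_d - low_d) * (temp_c - low_t) / (high_t - low_t)

# fan_duty(65.0) -> 75.0 (halfway between the set points)
# fan_duty(91.0) -> 100.0 (clamped, i.e. the loud part)
```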

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
WCCF link but it's using data from TPU

Is there anything to corroborate this being a bit blown out of proportion regarding improved performance on Win10? I thought it was known the 290X is marginally better than the stock 970 at higher resolutions?

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

FaustianQ posted:

WCCF link but it's using data from TPU

Is there anything to corroborate this being a bit blown out of proportion regarding improved performance on Win10? I thought it was known the 290X is marginally better than the stock 970 at higher resolutions?

It's WCCF, of course they are blowing it all out of proportion. The Fury X is only ahead at 4K, and only compared to the stock 980 Ti; anyone buying a 980 Ti is OCing it if they have half a brain, and at that point it blows right past the Fury X. On the other hand, the Fury X does not OC well at all, so it still makes much more sense to go with Nvidia at the top end.

It is good to see AMD improving their performance, they need to if they are ever going to claw any market share back from Nvidia, but there is a long, long way to go. I don't expect to see AMD really gaining much in market share until they have cards that outperform Nvidia's stuff by 15%-20% or more.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I really hope the new Radeon Software update later this month improves DX11 performance by 10% - a tall order, but making those kinds of gains is a great setup for Greenland, and it's always nice to be able to point to old hardware aging well for PR reasons.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

dissss posted:

In fact GFE seems to keep every driver ever in both its compressed and uncompressed forms - kinda annoying if you have a small SSD.

Is the exact folder for these known? My main system drive is only 128 GB, so I'd prefer not to have these piling up. The old manual installs would appear under C:\NVIDIA, with a folder for each driver build. Now the closest I could find was C:\Program Files\NVIDIA Corporation\Installer2, but wanted to check before I start deleting things since the files are separated by date instead of driver build.

dissss
Nov 10, 2007

I'm a terrible forums poster with terrible opinions.

Here's a cat fucking a squid.
The installers are under ProgramData\NVIDIA Corporation\NetService\


Each of those folders with the long identifier contains the installer exe

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
That's weird, the only thing in there for me is NVNetworkServiceAPI64.dll.

e: Looks like you can safely delete the Installer2 folder contents anyway, so there goes a couple gigs of wasted space.

The Illusive Man fucked around with this message at 19:53 on Nov 8, 2015
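For anyone who wants to check the damage before deleting anything, here's a rough Python sketch that totals up the cached installer folders. The two paths are the ones mentioned above (Installer2 and NetService); everything else is illustrative:

```python
# Sketch: report how much space the cached NVIDIA installer folders take
# before you delete anything. The two paths are the ones mentioned in the
# thread; adjust for your own system.
import os

def dir_size_bytes(path: str) -> int:
    """Total size in bytes of all files under path, recursively."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total

if __name__ == "__main__":
    for cache in (r"C:\Program Files\NVIDIA Corporation\Installer2",
                  r"C:\ProgramData\NVIDIA Corporation\NetService"):
        if os.path.isdir(cache):
            print(f"{cache}: {dir_size_bytes(cache) / 2**30:.2f} GiB")
```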

The Iron Rose
May 12, 2012

:minnie: Cat Army :minnie:

dissss posted:

The installers are under ProgramData\NVIDIA Corporation\NetService\


Each of those folders with the long identifier contains the installer exe


Sweet, thanks for this, you just saved me like 9 gigs.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
If you head to https://ninite.com/ and get the installer for WinDirStat, that program is like two clicks to get a graphical representation of your hard drive space, and it makes it really easy to find cached Nvidia/AMD driver downloads.

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

WCCF link but it's using data from TPU

Is there anything to corroborate this being a bit blown out of proportion regarding improved performance on Win10? I thought it was known the 290X is marginally better than the stock 970 at higher resolutions?

WCCFT is bad, and you should feel bad for falling for their bullshit.

The way those graphs work is that the percentages are normalized to the card being tested. Comparisons from the other cards to each other are meaningless. Even more so here, as the testing rig has changed and the corresponding variables cannot be accounted for.
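A toy example of that normalization problem, with made-up FPS numbers (none of these are real benchmark results): the same raw data for "card_b" shows up as a different percentage depending on which card the review is about.

```python
# Toy numbers showing why relative-performance charts from different reviews
# can't be compared: each chart normalizes to the card under test, so the
# same underlying FPS yields different-looking percentages. All numbers
# are invented for illustration.
fps = {"card_a": 60.0, "card_b": 48.0, "card_c": 75.0}

def relative_chart(baseline: str) -> dict:
    """Every card's performance as a percentage of the review's test card."""
    base = fps[baseline]
    return {card: round(100 * f / base, 1) for card, f in fps.items()}

# In the review of card_a, card_b charts at 80.0%.
# In the review of card_c, card_b charts at 64.0%.
# card_b's performance never changed; only the normalization baseline did.
```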

repiv
Aug 13, 2009

They also removed Project CARS from the test suite between those reviews, which alone probably accounts for a percent or two average speedup on AMD cards.

https://www.techpowerup.com/reviews/AMD/R9_Nano/20.html

repiv fucked around with this message at 02:07 on Nov 9, 2015
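A toy illustration of how suite composition alone moves the average - with made-up numbers, not TechPowerUp's data - dropping one title where a card underperforms lifts that card's suite-wide average even though no driver changed:

```python
# Invented per-game scores for one card (100 = parity with a rival card).
# Dropping the single outlier title shifts the suite average upward,
# without any change in the card itself.
suite = {"game_1": 100.0, "game_2": 95.0, "game_3": 105.0, "pcars": 70.0}

def average(results: dict) -> float:
    """Unweighted mean across the benchmark suite."""
    return sum(results.values()) / len(results)

with_pcars = average(suite)                                          # 92.5
without_pcars = average({g: v for g, v in suite.items() if g != "pcars"})  # 100.0
```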

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

repiv posted:

They also removed Project CARS from the test suite between those reviews, which alone probably accounts for a percent or two average speedup on AMD cards.

https://www.techpowerup.com/reviews/AMD/R9_Nano/20.html

In fairness, PCARS doesn't belong in a benchmark suite, since literally all it indicates is performance in PCARS.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I was less taken and more incredulous, but I was hoping that AMD was fixing their DX11 drivers, which will still be relevant in TYOOL 2017. There is basically no reason not to capture 50% of the nearly doubled performance they get out of DX12 for DX11.

codo27
Apr 21, 2008

Using shadowplay with Geforce experience to stream to twitch, is there any way to stop it from changing your channel title to [game title] powered by GeForce?

Ragingsheep
Nov 7, 2009

xthetenth posted:

In fairness, PCARS doesn't belong in a benchmark suite, since literally all it indicates is performance in PCARS.

Can't you say the same for most benchmarks/games?

Or is PCARS still heavily biased towards nVidia cards?

GrizzlyCow
May 30, 2011
Actually, I think whatever the gently caress was happening has been fixed. In the [H] R9 Nano review last week, the R9 Nano outperformed a factory-OC'd GTX 970, which is about its expected performance.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Ragingsheep posted:

Can't you say the same for most benchmarks/games?

Or is PCARS still heavily biased towards nVidia cards?

I didn't notice it changing. If it's better than "lol 770 beats 290" now, then at least it isn't an actively bad choice, but it's still kind of a dubious choice because there's plenty of games that generally have worked mostly right for both brands from the start.

Durinia
Sep 26, 2014

The Mad Computer Scientist

codo27 posted:

Using shadowplay with Geforce experience to stream to twitch, is there any way to stop it from changing your channel title to [game title] powered by GeForce?

Install OBS?

:smugbert:

1gnoirents
Jun 28, 2014

hello :)

codo27 posted:

Using shadowplay with Geforce experience to stream to twitch, is there any way to stop it from changing your channel title to [game title] powered by GeForce?

I noticed that as well when I went to the beta, but not with the non-beta version. Unfortunately I didn't see any obvious way to remove it in the beta. And the beta is such an improvement I don't want to go back either.

^^ I like OBS, and if you're actually going to stream seriously you have to use something like that for the overlays and whatnot, but to me it's not worth the overhead difference.


Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

1gnoirents posted:

^^ I like OBS, and if you're actually going to stream seriously you have to use something like that for the overlays and whatnot, but to me it's not worth the overhead difference.

If you're talking about the overhead for encoding, you can set OBS to encode using the GPU which should be the same as using Shadowplay. Even better, you can go into the BIOS and activate Intel Integrated Graphics, and use Intel QuickSync, which causes like no overhead whatsoever since the IGP can be completely dedicated.
