xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Skuto posted:

Wut, where does this come from? Denver's performance is great unless you focus on a single micro-benchmark no one (except AnandTech) considers indicative of anything any more. It's one of the fastest non-Apple ARM cores, and remember it has a process node disadvantage compared to those.

It's got what, one design win including NV's products?

Assepoester
Jul 18, 2004
Probation
Can't post for 11 years!
Melman v2
So uh basically going from 120Hz to 144Hz on a high refresh rate monitor is enough to kick NVIDIA's GPUs from their low power state into their full power state when just on the Windows Desktop, no games....

http://www.pcper.com/news/Graphics-Cards/Testing-GPU-Power-Draw-Increased-Refresh-Rates-using-ASUS-PG279Q



Meanwhile ATI has no problems.

Given how hard NVIDIA pushes their GSYNC monitors, you'd think they would have noticed and fixed this by now.
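
If anyone wants to check this on their own card, a quick way is to poll nvidia-smi and watch the reported P-state and clocks while the desktop sits idle. A minimal sketch in Python, assuming nvidia-smi is on your PATH (the query fields used here are standard nvidia-smi ones):
code:
import subprocess
import time

# pstate, graphics clock, memory clock, power draw
FIELDS = "pstate,clocks.gr,clocks.mem,power.draw"

def sample() -> str:
    # Returns one CSV line per GPU, e.g. P-state, clocks, and power draw.
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        text=True,
    )
    return out.strip()

if __name__ == "__main__":
    # Leave the desktop idle while this runs: at 120Hz you'd expect the idle
    # P-state and low clocks, at 144Hz the card may never drop down.
    for _ in range(10):
        print(sample())
        time.sleep(2)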

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Watch them blame Microsoft

1gnoirents
Jun 28, 2014

hello :)

Cardboard Box A posted:

So uh basically going from 120Hz to 144Hz on a high refresh rate monitor is enough to kick NVIDIA's GPUs from their low power state into their full power state when just on the Windows Desktop, no games....

http://www.pcper.com/news/Graphics-Cards/Testing-GPU-Power-Draw-Increased-Refresh-Rates-using-ASUS-PG279Q



Meanwhile ATI has no problems.

Given how hard NVIDIA pushes their GSYNC monitors, you'd think they would have noticed and fixed this by now.

I thought GSYNC only worked up to 120 Hz? Might be wrong or outdated there. This was posted here before as well, but I'm guessing it's going to get some traction now. I'd think the 135 MHz idle clock just can't handle it, so it just kicks up to full speed ahead, and so it might be a simple driver fix (?). We shall see. Edit: You know, I've seen this behavior before on my own cards at one point with a Korean overclockable panel.

Seamonster posted:

Watch them blame Microsoft

Or do what they always do: say nothing at all and quietly fix it a month later with drivers. Or, if they can't fix it, do even more nothing at all, because AMD is a festering hole of a company with nothing to threaten them with.

1gnoirents fucked around with this message at 15:10 on Oct 30, 2015

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Cardboard Box A posted:

So uh basically going from 120Hz to 144Hz on a high refresh rate monitor is enough to kick NVIDIA's GPUs from their low power state into their full power state when just on the Windows Desktop, no games....
It's better to run at 120Hz than 144Hz anyway, because the refresh rate is an exact multiple of 24, 30, and 60, all common video framerates, which eliminates any judder or video/audio desynch. It's not like there's a perceptible difference from the lower latency, so all you're doing is driving your monitor and video card harder for a worse experience.
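
To put rough numbers on the "exact multiple" point (my arithmetic, not anything from the article): if the refresh rate divides evenly by the content framerate, every frame is shown for the same number of refreshes; if it doesn't, the cadence is uneven and that's the judder. Note that 144 is actually fine for 24fps film, it's 30 and 60fps content that ends up uneven:
code:
# Judder check: does the refresh rate divide evenly by the content framerate?
for refresh in (120, 144):
    for fps in (24, 30, 60):
        ratio = refresh / fps
        cadence = "even" if ratio.is_integer() else "uneven -> judder"
        print(f"{refresh} Hz / {fps} fps = {ratio:g} refreshes per frame ({cadence})")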

Captain Yossarian
Feb 24, 2011

All new" Rings of Fire"
Edit: misread

Gwaihir
Dec 8, 2009
Hair Elf

1gnoirents posted:

I'd think the 135 MHz idle clock just can't handle it, so it just kicks up to full speed ahead, and so it might be a simple driver fix (?).

Most people just use Nvidia Inspector to turn the card back down to the lowest power state and it works 100% fine.

Bleh Maestro
Aug 30, 2003
I seem to be having an abnormal number of NVIDIA driver crashes recently. The driver crashes, the screen goes black, and then it comes back pretty quickly with an error saying the driver crashed and then recovered.

I'm on a GTX 970 / i7-2600K system. How do I troubleshoot this?

repiv
Aug 13, 2009

Start by using DDU to nuke all traces of the NVIDIA driver and its settings, and see if a fresh start fixes it.

http://www.guru3d.com/files-details/display-driver-uninstaller-download.html
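
It can also be worth seeing how often the TDRs are actually happening before and after the clean reinstall. A minimal sketch (Python on Windows; assumes the stock wevtutil tool, and that the recoveries are logged as Event ID 4101 from the "Display" source, which is the usual home of "display driver stopped responding and has recovered"):
code:
# Count recent TDR ("driver crashed and recovered") events in the System log.
import subprocess

QUERY = "*[System[Provider[@Name='Display'] and (EventID=4101)]]"

out = subprocess.check_output(
    ["wevtutil", "qe", "System", f"/q:{QUERY}", "/c:10", "/f:text", "/rd:true"],
    text=True,
)
print(out.strip() or "No recent TDR events found.")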

1gnoirents
Jun 28, 2014

hello :)

Gwaihir posted:

Most people just use Nvidia Inspector to turn the card back down to the lowest power state and it works 100% fine.

Ah so probably an easy fix

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Gwaihir posted:

Most people just use Nvidia Inspector to turn the card back down to the lowest power state and it works 100% fine.

Sadly, opening up that panel locks my screen up.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

xthetenth posted:

It's got what, one design win including NV's products?

It's in the Nexus 9 (HTC).

AFAIK their future designs were announced to have Denver cores as well, so they still prefer it over licensing ARM's designs.

Gwaihir
Dec 8, 2009
Hair Elf

Subjunctive posted:

Sadly, opening up that panel locks my screen up.

Yea, I'm not including myself in that "most people" part either, heh. I suspect it's probably Gigabyte being janky more than anything, but attempting to set P-states to idle ends up with my screen turning into a blizzard of gibberish. It doesn't lock the PC (I can make two NVInspector shortcuts, one to set the idle P-state and one to reset to defaults; I can still use the keyboard to tab down to the default one and mash it, whereupon the screen is restored), but it certainly doesn't work.
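
If anyone would rather script those two shortcuts than click them, something like this is the rough idea. A sketch only: the install path is made up, and the -forcepstate switch and the "16 = hand control back to the driver" convention are assumptions to verify against your own copy of Inspector before relying on it:
code:
import subprocess
import sys

# Hypothetical install path - adjust to wherever your Inspector lives.
INSPECTOR = r"C:\Tools\nvidiaInspector\nvidiaInspector.exe"

def force_pstate(pstate: int, gpu_index: int = 0) -> None:
    # Assumed command-line form: -forcepstate:<gpu_index>,<p_state>
    subprocess.run([INSPECTOR, f"-forcepstate:{gpu_index},{pstate}"], check=True)

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "idle":
        force_pstate(8)    # P8 = low-power idle state
    else:
        force_pstate(16)   # assumed: return P-state control to the driver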

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Gwaihir posted:

Yea, I'm not including myself in that "most people" part either, heh. I suspect it's probably Gigabyte being janky more than anything, but attempting to set P-states to idle ends up with my screen turning into a blizzard of gibberish. It doesn't lock the PC (I can make two NVInspector shortcuts, one to set the idle P-state and one to reset to defaults; I can still use the keyboard to tab down to the default one and mash it, whereupon the screen is restored), but it certainly doesn't work.

Yeah, for me just selecting the video card checkbox turns the screen solid blue, have to reboot. I don't even get to change any settings.

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

Boiled Water posted:

Afaik Nvidia holds an x86 license which basically says "you can use our technology but never make a real CPU, only PCIe add-in cards".

Can you give a source for that info? Because AFAIK the current agreement between Nvidia and Intel states that they cannot produce x86 chips or emulate x86 operations a la Transmeta. Who has been stupid enough to give them an x86 license anyway?

edit: my info is from this five-year-old snippet: http://semiaccurate.com/2010/08/17/details-emerge-about-nvidias-x86-cpu/

Rosoboronexport fucked around with this message at 20:23 on Oct 30, 2015

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

I think I read it in the AMD thread when a discussion came up about the Intel-AMD cross license agreement (which basically fucks AMD in some impolite areas if they try to make a major move).

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.
Intel and Nvidia have a cross patent license, but it excludes x86. Long before the current one, they had a similar one that was needed for Nvidia to make Intel chipsets.

Richardanator
May 8, 2006

Mmmm.....
I posted this in the PC building thread, but it is probably better here.

Richardanator posted:

I'm curious about the differences between the MSI Gaming 970 LE and the regular MSI Gaming 970. From the few sites that I've visited, it seems that the only difference is in the core/memory clocks. Is there anything else I should be aware of? If I OC, would I be able to reach similar OCs as the non-LE version? Newegg's got it right now for $280 vs $330 for the regular version, and for a 60 MHz difference it seems worth it to get the LE. Any comments?

SwissArmyDruid
Feb 14, 2014

by sebmojo
XFX stahp. XFX wat r u doin. https://twitter.com/Shannon_Piel/status/659423187847876608

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

I can't hear you!

Panty Saluter
Jan 17, 2004

Making learning fun!

Richardanator posted:

I posted this in the PC building thread, but it is probably better here.

Oddly enough I have an OC and my wife has a Gaming LE and both appear to have the same clocks. Hers is the multifan "gaming" cooler and mine is reference. I think hers was a couple dollars cheaper.

Gaming LE: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127841

OC: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127835

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

It appears XFX still has a bunch of 290 blowers lying about, just right for all 5 people who are going to CF 390s and haven't watercooled the setup anyway.

Rockybar
Sep 3, 2008

To what extent should the GeForce Experience optimised settings be heeded? Are they really going to reduce quality for a solid fps, or are they actually quite sensible?

Rastor
Jun 2, 2001


Because you put it in something like this:
http://www.fractal-design.com/home/product/cases/node-series/node-202-integra-sfx-450w-psu

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

FaustianQ posted:

It appears XFX still has a bunch of 290 blowers lying about, just right for all 5 people who are going to CF 390s and haven't watercooled the setup anyway.

Nahh, it's not sufficient for that TDP, especially choked off like that. For CF I'd probably want a Vapor-X or similar on the one with unrestricted airflow and a CLC rig on the blocked one, or a CLC rig on both.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rockybar posted:

To what extent should the GeForce Experience optimised settings be heeded? Are they really going to reduce quality for a solid fps, or are they actually quite sensible?
They're just what someone at NVIDIA thinks is a good balance. I don't follow them very much, honestly, especially since if you've got a 970/980/Ti and aren't still running at stock like an idiot, your actual GPU performance is noticeably higher than what GFE thinks it is.

Wiggly Wayne DDS
Sep 11, 2010



From the ones I've checked, they either target 40-50 fps or, if they use DSR at all, low-ball the resolution.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

xthetenth posted:

Nahh, it's not sufficient for that TDP, especially choked off like that. For CF I'd probably want a Vapor-X or similar on the one with unrestricted airflow and a CLC rig on the blocked one, or a CLC rig on both.

That was my point, though: people CFing 390s are going to use a more sensible solution than the insanely bad 290 blower. It's basically only useful if it reduces cost on the consumer side enough to make a CLC worth it.

1gnoirents
Jun 28, 2014

hello :)
Just getting trolled at this point


Lol in the comments
https://www.youtube.com/watch?v=u5YJsMaT_AE

1gnoirents fucked around with this message at 22:44 on Oct 31, 2015

HORMELCHILI
Jan 13, 2010


So how would I go about setting up a dedicated PhysX card? If I use a GTX 970, could I just plug in my old 560 Ti for it, provided I have enough power?

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


The 560 Ti probably isn't worth using for PhysX; keep in mind it's still over a hundred extra watts and it doesn't idle as low as modern cards.

Also not all PhysX is hardware PhysX (in fact, almost nothing is, especially these days).

If you just want to say you can, then yeah sure.

SlayVus
Jul 10, 2009
Grimey Drawer

Sir Unimaginative posted:

The 560 Ti probably isn't worth using for PhysX; keep in mind it's still over a hundred extra watts and it doesn't idle as low as modern cards.

Also not all PhysX is hardware PhysX (in fact, almost nothing is, especially these days).

If you just want to say you can, then yeah sure.

Actually, if a game uses PhysX, it can switch between software and hardware if it detects the appropriate hardware.

This gives the benefit of the added effects.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Is there a PhysX card that even makes sense to get at this point, though? I mean, the "lowest" card I've seen recommended is a 750 Ti, and those are still $100, which seems like a lot to marginally improve the speed/effects of the few games that support it.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

DrDork posted:

Is there a PhysX card that even makes sense to get at this point, though? I mean, the "lowest" card I've seen recommended is a 750 Ti, and those are still $100, which seems like a lot to marginally improve the speed/effects of the few games that support it.

There is no point in getting a card just for PhysX, especially since modern cards and CPUs do those effects with a minimal performance hit.

1gnoirents
Jun 28, 2014

hello :)
I saw a next-gen PhysX demo that creamed the current gen and that conceivably might make use of a dedicated card, but even typing this out I realize no, it probably won't matter.

Tykero
Jun 22, 2009
I posted this in the PC Building/Upgrading/Parts-picking Megathread, but it might make sense here too:

I'm considering upgrading my GTX 760 to a GTX 980 Ti. I play a lot of video games and I like to press "ultra" and be done with video settings, which I can't do any more.
I play at 1920x1200 resolution, dual monitor (one for gameplay, the other usually has a Twitch stream or something on it).

Anybody have any recommendations on a particular version?


Also, I'm currently using a Core i5-4670K. Should I consider an upgrade there as well, or is processor bottlenecking not much of a concern here?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
A 980Ti is pretty much overkill at 1200p, but you're 100% guaranteed to be able to put everything at '11' without any worries whatsoever.

As for 'which,' it honestly comes down to the MSI/ASUS/Gigabyte question again. eVGA only if you prize warranty service highly.

And I'm still using a 2500K @ 4.4GHz and have yet to feel CPU-limited in anything. I'm waiting until Skylake-E for a rebuild (or until something vital shorts out on my motherboard), which will mean I've been using this particular box for ~4 1/2 years, and longer if they delay Sky-E.

BIG HEADLINE fucked around with this message at 06:47 on Nov 1, 2015

VelociBacon
Dec 8, 2009

Tykero posted:

I posted this in the PC Building/Upgrading/Parts-picking Megathread, but it might make sense here too:

I'm considering upgrading my GTX 760 to a GTX 980 Ti. I play a lot of video games and I like to press "ultra" and be done with video settings, which I can't do any more.
I play at 1920x1200 resolution, dual monitor (one for gameplay, the other usually has a Twitch stream or something on it).

Anybody have any recommendations on a particular version?


Also, I'm currently using a Core i5-4670K. Should I consider an upgrade there as well, or is processor bottlenecking not much of a concern here?

If you're only running 1080p and don't plan to upgrade that at all in the near future, just get a 970 of your favorite flavor.

e: 1200p, still.

VelociBacon fucked around with this message at 08:54 on Nov 1, 2015

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Tykero posted:

I'm considering upgrading my GTX 760 to a GTX 980 Ti

Anybody have any recommendations on a particular version?

I just posted a link in the parts-picking thread; Newegg has an EVGA 980 Ti for $610 after rebate, with a backplate, mousepad, and free game.
