Carecat
Apr 27, 2004

Buglord
They're supposed to be releasing 8GB versions of the 900 series. I'd assume not for the 960, but there's little demand for it outside of 4K gaming, so they might sit on it until they can refresh the product line and justify keeping the price up.

1gnoirents
Jun 28, 2014

hello :)
*flashback to 760s with 4GB for $400*

Ziploc
Sep 19, 2006
MX-5
If I already have a 144Hz low-input-lag monitor, am I really going to notice a difference with G-Sync?

SwissArmyDruid
Feb 14, 2014

by sebmojo

Carecat posted:

They're supposed to be releasing 8GB versions of the 900 series. I'd assume not for the 960, but there's little demand for it outside of 4K gaming, so they might sit on it until they can refresh the product line and justify keeping the price up.

7 GB version for the 970.

1gnoirents
Jun 28, 2014

hello :)

Ziploc posted:

If I already have a 144Hz low-input-lag monitor, am I really going to notice a difference with G-Sync?

As far as I can tell, yes. Micro stutters and tearing are pretty much constant, which you can "prove" yourself by logging some of the games you play. I'd guess that's where the whoa factor comes in, despite whatever the fps counter says. Now, is it worth the price difference? That's the real question.
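
If you want to try the logging approach, here's a rough sketch of the kind of thing I mean, assuming you've exported per-frame times in milliseconds (one value per line) from whatever logger you use — the filename and the "twice the median" stutter threshold are just placeholders I made up:

```python
# Rough sketch: spotting microstutter in a frame-time log.
# Assumes a plain-text export with one frame time in milliseconds per line;
# "frametimes.csv" is a made-up filename, use whatever your logger writes.
import statistics

def summarize(frametimes_ms):
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # 99th-percentile frame time, reported as "1% low" fps.
    worst = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]
    # Crude stutter flag: any frame taking more than twice the median.
    median = statistics.median(frametimes_ms)
    stutters = sum(1 for t in frametimes_ms if t > 2 * median)
    return avg_fps, 1000.0 / worst, stutters

if __name__ == "__main__":
    with open("frametimes.csv") as f:
        times = [float(line) for line in f if line.strip()]
    avg, low, stutters = summarize(times)
    print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps, {stutters} stutter frames")
```

The average fps can look great while the 1% lows and the stutter count tell you why it doesn't feel like it.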

veedubfreak
Apr 2, 2005

by Smythe
So glad I've never been able to see microstutter. And tearing isn't really an issue 'cause I can't push 60+ fps on this triple-monitor setup anyway.

WHERE IS THE GOD DAMNED MAILMAN, I WANT MY CARDS.

Don't remember if I already mentioned this or not, but here is a neat little script that lets you switch between NVIDIA Surround and extended desktop.
http://hardforum.com/showthread.php?t=1590030&highlight=surround+extended

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Tearing can be an issue at any frame rate, no?

veedubfreak
Apr 2, 2005

by Smythe

Subjunctive posted:

Tearing can be an issue at any frame rate, no?

I thought tearing was mostly when the video card is pushing frames faster than the monitor can keep up.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

veedubfreak posted:

I thought tearing was mostly when the video card is pushing frames faster than the monitor can keep up.

It happens when the frame buffer is modified during scan-out, so it's more common at high frame rates, but I don't think it's inherent to them. I don't want to draw a picture and think harder about it, though.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:
I can notice stutter and it sucks. The fps counter says 100, but it feels like 30. Especially if you watch the same game on YouTube or whatever at 60 fps, it becomes clear that in your own game you aren't getting that smoothness.


That's what the Titan X is for though, right? :v:

Mazz
Dec 12, 2012

Orion, this is Sperglord Actual.
Come on home.
So I've been slowly collecting parts to rebuild my machine, and I have a couple of GPU-related questions. I'm currently running a GTX 480; it still works well, but I feel like I'm starting to notice its age.

How much of an upgrade would a 960 be?
How about a 970?
Is either of those cards physically larger than the 480? I'm roughly at the limits of my current case with the 480.
I don't really have $400+ to throw at a video card at the moment, so I'm not considering SLI (as of now) or a 980.

Mainly I'm curious whether it's worth holding on to my $200-350 and waiting for the next series, as I've read NVIDIA is on a two-year cycle with GPUs, but I really have no clue anymore since I haven't paid attention to the market in a while.

veedubfreak
Apr 2, 2005

by Smythe

Mazz posted:

So I've been slowly collecting parts to rebuild my machine, and I have a couple of GPU-related questions. I'm currently running a GTX 480; it still works well, but I feel like I'm starting to notice its age.

How much of an upgrade would a 960 be?
How about a 970?
Is either of those cards physically larger than the 480? I'm roughly at the limits of my current case with the 480.
I don't really have $400+ to throw at a video card at the moment, so I'm not considering SLI (as of now) or a 980.

Mainly I'm curious whether it's worth holding on to my $200-350 and waiting for the next series, as I've read NVIDIA is on a two-year cycle with GPUs, but I really have no clue anymore since I haven't paid attention to the market in a while.

A 970 would blow your mind. A 960 will be a very good upgrade. The 970 is probably the best price/performance card on the market at the moment, and that probably won't change for a while.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


How much longer until we see the next big refresh of cards?

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
If I were to get a 970 for 1080p gaming, would it be powerful enough to run PhysX in games that support it without too much performance loss, or would I need a dedicated PhysX card?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Tab8715 posted:

How much longer until we see the next big refresh of cards?

Since nVidia knows they don't have to rush, they'll probably show at Computex to steal AMD's thunder, and go to market again in or around September like they did with the 970/980.

spasticColon posted:

If I were to get a 970 for 1080p gaming, would it be powerful enough to run PhysX in games that support it without too much performance loss, or would I need a dedicated PhysX card?

My 970 is pretty much overkill for 1200p, so there's a lot of headroom.

Kazinsal
Dec 13, 2011



spasticColon posted:

If I were to get a 970 for 1080p gaming, would it be powerful enough to run PhysX in games that support it without too much performance loss, or would I need a dedicated PhysX card?

I don't think "dedicated PhysX card" has been a thing in a long time.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

Kazinsal posted:

I don't think "dedicated PhysX card" has been a thing in a long time.

I'm talking about using a dedicated Nvidia card, like say a $100 GTX 750, for games that support PhysX. But if a single 970 is powerful enough to run graphics and PhysX at 1080p, then I'll probably pull the trigger on one of those cards.

spasticColon fucked around with this message at 23:30 on Mar 7, 2015

veedubfreak
Apr 2, 2005

by Smythe

spasticColon posted:

I'm talking about using a dedicated Nvidia card, like say a $100 GTX 750, for games that support PhysX. But if a single 970 is powerful enough to run graphics and PhysX at 1080p, then I'll probably pull the trigger on one of those cards.

Considering how few games have come out with PhysX recently, it shouldn't be a problem.

USPS sucks at updating the status of my package.

BurritoJustice
Oct 9, 2012

NVIDIA's CEO claimed that the card will be faster than even the previous-generation dual-GPU flagship, the GeForce GTX TITAN Z. (Referring to the Titan X.)

Panty Saluter
Jan 17, 2004

Making learning fun!

1gnoirents posted:

Funny, I have a Gigabyte mobo and I used to get those. Occasionally when I boot it says "hard drive read failure" as well, despite my having used two different SSDs and an HDD as my Windows drive over the year. I just kind of gave up. Really can't say it's the reason, I guess, but still.

I say "used to" because it happened far more often in the past and way less now, especially BSODs with ntoskrnl.

Now I'm fiddling with my board and getting some eye-opening results.

My CPU has been hitting 80C+ despite being cooled by a Noctua NH-U14S... maybe one fan is too few? More importantly, the Vcore is hitting as high as 1.4V despite my telling it not to. Everything I've read says 1.3V is too high, as is 80C+. Also, the CPU package power topped out at 164W during one Prime95 run... after just a few seconds. :supaburn:


I just don't think any of the relevant clock controls in the BIOS actually DO anything.

veedubfreak
Apr 2, 2005

by Smythe

Incoming 1500 dollar price tag.

Still no mailman. QQ

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Panty Saluter posted:

Now I'm fiddling with my board and getting some eye-opening results.

My CPU has been hitting 80C+ despite being cooled by a Noctua NH-U14S... maybe one fan is too few? More importantly, the Vcore is hitting as high as 1.4V despite my telling it not to. Everything I've read says 1.3V is too high, as is 80C+. Also, the CPU package power topped out at 164W during one Prime95 run... after just a few seconds. :supaburn:


I just don't think any of the relevant clock controls in the BIOS actually DO anything.

I'm not super knowledgeable on CPU cooling, but don't you really need at least one intake and one exhaust fan for proper cooling? Otherwise you're either never getting cool air in or never getting hot air out, and either way your CPU is stewing in its own juices. But that's outside the scope of this thread, I think.

veedubfreak
Apr 2, 2005

by Smythe

Panty Saluter posted:

Now I'm fiddling with my board and getting some eye-opening results.

My CPU has been hitting 80C+ despite being cooled by a Noctua NH-U14S... maybe one fan is too few? More importantly, the Vcore is hitting as high as 1.4V despite my telling it not to. Everything I've read says 1.3V is too high, as is 80C+. Also, the CPU package power topped out at 164W during one Prime95 run... after just a few seconds. :supaburn:


I just don't think any of the relevant clock controls in the BIOS actually DO anything.

That sounds more like either a bad mount or not enough air movement.

Which cpu?

SwissArmyDruid
Feb 14, 2014

by sebmojo
Wait, so they went Titan, Titan Black, Titan Z, and now it's Titan X? Jesus, guys, get your loving naming convention together.

Panty Saluter
Jan 17, 2004

Making learning fun!

veedubfreak posted:

That sounds more like either a bad mount or not enough air movement.

Which cpu?

i5-4670K

I guess I can try re-applying the paste and reseating.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

Wait, so they went Titan, Titan Black, Titan Z, and now it's Titan X? Jesus, guys, get your loving naming convention together.

They did; you see, each stage of the naming process is more EXTREME for your REAL GAMER needs. I'm waiting on Titan Alpha, Titan Omega, and Titan Tits and Explosions. It's the literary form of putting a gun-shaped heatsink on your motherboard.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
It's not the NH-U14S, that's for sure; that's a very capable cooler. Those are the kind of temps I'd expect to see if you didn't use thermal paste at all. Definitely not normal. 1.4V is scary; something weird is going on. First thing, I guess, is to confirm those readings are accurate with another program, because honestly it sounds like the sensors are being misinterpreted.

Then I'd move on to remounting the CPU cooler along with a fresh application of thermal paste.

Beautiful Ninja posted:

I'm not super knowledgeable on CPU cooling, but don't you really need at least one intake and one exhaust fan for proper cooling? Otherwise you're either never getting cool air in or never getting hot air out, and either way your CPU is stewing in its own juices. But that's outside the scope of this thread, I think.

He's talking about putting two fans on the CPU cooler itself for push/pull, not the case fans. Otherwise I guess if he were trying to run the cooler as a passive cooler that might explain those temps.

Panty Saluter
Jan 17, 2004

Making learning fun!
Nope, got an exhaust fan on the heatsink. I'm using HWiNFO; what else is good?

veedubfreak
Apr 2, 2005

by Smythe
Afterburner will give you CPU temps, along with CPU-Z.

Still no video cards. WHY MUST YOU FORSAKE ME UNCLE SAM!

Panty Saluter
Jan 17, 2004

Making learning fun!
Yeah, hitting high 60s after just a couple minutes of Prime95 at 3.4 GHz (so not overclocked).

SwissArmyDruid
Feb 14, 2014

by sebmojo

veedubfreak posted:

Afterburner will give you CPU temps, along with CPU-Z.

Still no video cards. WHY MUST YOU FORSAKE ME UNCLE SAM!

Yeah, I dunno what's up with USPS. I've got a package that's stuck out in Texas for some godforsaken reason, when it was supposed to be here Friday.

Panty Saluter posted:

Yeah, hitting high 60s after just a couple minutes of Prime95 at 3.4 GHz (so not overclocked).

Time to repaste that fucker. Remember, just a dot the size of a grain of rice will do you, because too much will act as an insulator. Doubly so if it's some super-thick stuff that doesn't get fluid until it hits higher temps like Shin-Etsu or some crap, because sometimes Intel processors won't get hot enough to reflow.

SwissArmyDruid fucked around with this message at 04:39 on Mar 8, 2015

Panty Saluter
Jan 17, 2004

Making learning fun!
Will do.

Is Noctua's paste garbage? That's what I used last time.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

veedubfreak posted:

I thought tearing was mostly when the video card is pushing frames faster than the monitor can keep up.

Not quite. It happens whenever the framerate and monitor refresh rate are out of sync, and it doesn't matter which one is too slow or too fast.

To understand why, you have to go back to old-school displays. GPUs and monitors are very simple-minded when it comes to displaying pixels: they almost always update each pixel left to right, top to bottom, in lockstep with a defined clock. This is absolutely necessary when you're talking about controlling the intensity of three electron beams in a CRT that have to keep moving in a fixed pattern, at a speed that's slow enough to not overwhelm the timing hardware but fast enough to keep the phosphors lit. It's not necessary with an LCD, but the old standard generally works, and everything works with it.

At least, until the GPU raises its hand and asks, "so, what happens when the monitor starts refreshing the top-left pixel, I have to follow the clock, and I don't have a brand-new frame ready right this instant?" It has to start out with sending the old frame, because that's all it's got, but when the new frame is ready mid-refresh, there's a choice. Start displaying it now, and the monitor gets the freshest possible information on the screen right away, but the user sees a tear in the image - the top half of the monitor shows the old frame, and the bottom half shows the new frame. Since the GPU's not locked to the refresh rate and is just throwing new frames down the pipe whenever it's got them, this is "vsync off."* The other choice is to keep displaying the old frame for the entire refresh cycle, then start showing the new frame. You get a beautiful tear-free image, long after it was first rendered. Since the GPU is paying attention to the refresh rate and syncing new frames to it, this is "vsync on."

With G-sync and Freesync, the monitor and GPU are freed from the tyranny of that single clock. Instead of having to either tear the image or wait for an entire frame, the GPU can just tell the monitor, "hey, hold on for just a few more milliseconds, so I can get the next frame ready." The instant the GPU puts the frame together, it can tell the monitor to start refreshing again. Presto, no tearing and no delay.

*the "v" stands for "vertical," where the pixel getting refresh data makes the vertical jump from the bottom of the screen all the way back to the top. Even DVI-D and HDMI allow for a pause here, so if there's an electron beam it can make its journey back to the starting point; HDMI uses this "vertical blanking interval" as a convenient place to pack audio data down the video wires. Horizontal sync is also a thing in CRTs, where the pixel getting refreshed goes from the far right of the display to the left of the next line, but we don't care about it here unless we're talking about how awesome old-school SLI setups were.

Potato Salad
Oct 23, 2014

nobody cares


SwissArmyDruid posted:

Time to repaste that fucker. Remember, just a dot the size of a grain of rice will do you, because too much will act as an insulator. Doubly so if it's some super-thick stuff that doesn't get fluid until it hits higher temps like Shin-Etsu or some crap, because sometimes Intel processors won't get hot enough to reflow.

When replacing the OEM thermal paste on an EVGA GTX 770 with an ACX cooler, is there a brand of thermal paste that's preferable? Is Arctic Silver fine? Do I need to use something different for the memory chips?

SwissArmyDruid
Feb 14, 2014

by sebmojo

Potato Salad posted:

When replacing the OEM thermal paste on an EVGA GTX 770 with an ACX cooler, is there a brand of thermal paste that's preferable? Is Arctic Silver fine? Do I need to use something different for the memory chips?

Arctic Silver works fine, just make sure it's not too old, because I know a lot of y'all still have tubes from way back. Memory chips usually have a thermal pad contacting the backside of the cooler, or some adhesive used to stick the mini heatsinks onto them; best to leave those be in either case.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Panty Saluter posted:

Will do.

Is Noctua's paste garbage? That's what I used last time.

No, Noctua's paste is good. There's very little meaningful difference between pastes. Noctua's comes in a huge tube though, so don't use too much.

Ziploc
Sep 19, 2006
MX-5

SwissArmyDruid posted:

Arctic Silver works fine, just make sure it's not too old, because I know a lot of y'all still have tubes from way back. Memory chips usually have a thermal pad contacting the backside of the cooler, or some adhesive used to stick the mini heatsinks onto them; best to leave those be in either case.

How old is too old?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
You guys... I am disappointed.

Panty Saluter posted:

More importantly, the Vcore is hitting as high as 1.4V despite my telling it not to.

Check your Load-Line Calibration (LLC) setting and turn it off. LLC works by adding voltage under high-stress situations, essentially acting as an additional offset on top of your Vcore. It's useful, I guess? But it makes your Vcore higher than you think.
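
Back-of-the-envelope version of what that offset does, with invented numbers (your board's droop and LLC step will differ):

```python
# Toy arithmetic: why LLC can push Vcore past the value you set in the BIOS.
# All three numbers are made up for illustration.
vcore_set = 1.250   # volts typed into the BIOS
vdroop    = 0.050   # sag under a heavy load like Prime95
llc_comp  = 0.080   # extra voltage the board adds to "correct" the droop

load_vcore = vcore_set - vdroop + llc_comp
print(f"set: {vcore_set:.3f} V   under load: {load_vcore:.3f} V  <- higher than you set")
```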

Panty Saluter
Jan 17, 2004

Making learning fun!
Will do :)

Also I just remembered that I oriented the HSF so that it blows upward, theoretically taking advantage of natural convection. Maybe I should have it blow front to back since I might be drawing hot GPU air. Hrm.

GokieKS
Dec 15, 2012

Mostly Harmless.
Also remember that Haswell chips have a voltage boost for AVX instructions that you can't get around unless you manually set the voltage, so if you're testing with something that uses AVX (like Prime95), that could very well be the reason for the higher-than-expected voltages.
