The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Agreed posted:

Different cards have different voltage ceilings set by the manufacturer. That slider is non-equivalent from brand to brand, or from SKU to SKU within the same vendor on the same chip.

So it's safe to just increase voltage to whatever the ceiling is?


Animal
Apr 8, 2003

Yes. Every now and then, check the slider after reboots, because it has a tendency to go back to default with PrecisionX (and I think Afterburner shares the same code).

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
Cool thanks.

Big Mackson
Sep 26, 2009

veedubfreak posted:

This might surprise some of you, but there are reports coming out that 20nm won't happen this year. I'm totally shocked and not at all happy with my purchase of 290s :)

http://wccftech.com/nvidia-maxwell-20nm-delayed-late-2014-early-2015/

:hf:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I admit that I do feel pretty validated in having bought the 780, though I feel like springing again for the 780Ti, even after recouping some through letting the nice 780 go, was overkill. I've noticed lately I seem to be playing less demanding games rather than more demanding ones - the titles that push graphical boundaries, at least recently, haven't been things I like to play. Battlefield 4 looks god damned amazing but I have zero interest in the genre of online multiplayer competitive FPS, so I'm way more likely to play something that the 780 could have skunked no problem with its great OC (for a 780 - the 780Ti really did redefine the parameters for what you can get out of GK110).

I'm having trouble thinking of the next game on the horizon that's going to make this thing crank up to its full-bore OC and keep it there, as opposed to trying to get Borderlands 2 to do that by forcing a shitload of SSAA and transparency SSAA onto it. The Witcher 3 looks promising, but I really don't see a lot coming down the pipeline that hits the sweet spot between "super fancy graphics" and "I like this gameplay," which makes the whole thing a bit of a wasteful farce for me unfortunately.

Don't get me wrong, I would need at least a 280X to run games as well as I want to, and preferably a 290 at original fair price or a 780 at its dropped fair price; I very much enjoy some games that push those adequately. But the 780Ti, and the idea that I was going to hop right into a G-Sync monitor, haven't panned out as well as I'd hoped, and with so many really enjoyable indie titles that don't stress a damned GTX 540M taking quite a bit of my attention lately, it all seems a bit silly, really. But I'm sure I'll keep finding ways to justify spending too much money on a graphics card with marginally better performance than the more affordable second-best alternative in the lineup, just because it seems to be a pattern of consumption I don't want to break away from. I enjoy all the new tech stuff; I don't really have to justify it. It's just good that I don't, 'cause I couldn't if I had to :v:

Big Mackson
Sep 26, 2009
ask me about buying a 290x to play roguelikes and morrowind.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
If it makes you feel better, the first game I've played on my new 780ti is FTL: Advanced Edition. But between Dragonage 3 and Witcher 3 it's going to be a good year.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Agreed posted:

I've noticed lately I seem to be playing less demanding games rather than more demanding ones - the titles that push graphical boundaries, at least recently, haven't been things I like to play.

Welcome to a rapidly filling boat.

I'm wondering if this isn't a symptom of a larger trend. Between graphics processors (like all others) potentially reaching physical limits, the revival of small-team game development, and AAA production eating its own tail, we could be entering a time when art complexity takes a back seat to art direction, and that trend would be a hell of a thing to credit for turning threads like this one into more of a curiosity than a current-events tracker/discussion board.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

How many other "I have a badass video card, but it spends a lot of time at least 250MHz away from its maximum core OC and voltage 'cause that's more than sufficient to play the cool remake of Strider, or Enslaved: Odyssey to the West, or XCOM, or..." folks are there out there? Creative and fun games are so often coming out of indie studios, and they aren't pushing the graphics, you know what I mean? Mark of the Ninja was awesome, but I didn't need a GTX 680 to play it; the aforementioned GTX 540M, which is, make no mistake, fuckin' garbage, handled it super well.

There's just not a major intersection lately between games that are interesting to me and games that actually stress a card with seven billion transistors running hard. How many others are in a similar position? I still like it for the times when I DO play the really demanding games, or have a really smooth experience in Saints Row 3 and 4, or the Metro games, all that jazz... I dunno. It's just a luxury item and doesn't purport to be anything else. I just wish I could give it more stuff to really work with, because games that DO have the graphical chops to actually exercise this thing's logic in a serious way look amazing.

I know the tail wags the dog, here; I keep up with release schedules for cards much more attentively than the release schedules for games, hah. It's not that I don't get that it's backwards :D I'd just love to have more pretty things for it to do that aren't super cool demo scene releases like that real-time refraction one. Impressive! But I could probably just load up a youtube of somebody's tri-SLI rig running it and have an even better experience! :suicide:

Animal
Apr 8, 2003

I have SLI'd 780's and spend most of my time playing World of Tanks and Titanfall, neither of which can use SLI. Basically the only super demanding games I run are the Metro games and Far Cry 3, and I only play them sporadically. They look beautiful and run great at 1600p. BF4 hasn't done it for me so far. I bought Witcher 2 but I'm hesitant to play it because I never played part 1, and I don't have time to play them both.

I spend very little time at home where I can enjoy my rig, so what I think I'll do is play all the non-demanding poo poo in hotels on my MacBook Pro (750M) and leave the killer stuff for the SLI.

td4guy
Jun 13, 2005

I always hated that guy.

Agreed posted:

Battlefield 4 looks god damned amazing but I have zero interest in the genre of online multiplayer competitive FPS
Heck, I catch flak from my friends for turning up the graphics in that game with my 680. They say the added visual clutter from high graphics settings makes enemies harder to see, making me more likely to get killed. Turn everything to Low they say. *sigh* It's so pretty!

Big Mackson
Sep 26, 2009

td4guy posted:

Heck, I catch flak from my friends for turning up the graphics in that game with my 680. They say the added visual clutter from high graphics settings makes enemies harder to see, making me more likely to get killed. Turn everything to Low they say. *sigh* It's so pretty!

Well, it is a legitimate complaint. For example morrowind with overhaul mods makes it harder to find things sometimes. (Where is Tax Collector?)

Nephilm
Jun 11, 2009

by Lowtax
Not so long ago, the MechWarrior Online devs blocked editing the config files because people were turning off graphical features like dirty cockpit glass and distance blur to give themselves an edge. Those effects are a layer of poo poo over an otherwise very pretty game, but removing them goes against THEIR VISION and all that jazz.

Star War Sex Parrot
Oct 2, 2003

The Lord Bude posted:

If it makes you feel better, the first game I've played on my new 780ti is FTL: Advanced Edition. But between Dragonage 3 and Witcher 3 it's going to be a good year.
Witcher 3 got delayed until 2015, and along with it any motivation I had to replace my 680 GTX that I sold recently.

veedubfreak
Apr 2, 2005

by Smythe
So last night I decided to delid my 4770k and put my new sli bridge on. Well, the drat thing don't fit, I must have ordered the wrong bridge. Sigh. Delidding my processor was surprisingly easy, clamped it in the vice, tapped the piece of wood 3 times with a hammer, and boom, clean silicon. The Liquid Ultra was surprisingly easy to use too. I didn't even squeeze the tube, there was enough in the cap to get a full spread on the die. Ended up only putting a single video card back in because I'm gonna have to tear it down again when I get the proper sli bridge. Was curious to see how far I can overclock with just a single card.

After delidding the chip and changing over to the EK waterblock, my temps are down from 70C at max to 43... 43 degrees at full tilt. Intel really screwed up with these Haswell chips by cheaping out on the TIM.

I'd post pics of the process but my iphone is being a bitch and won't let me access the pictures. Stupid Crapple products.

Ignoarints
Nov 26, 2010

veedubfreak posted:

So last night I decided to delid my 4770k and put my new sli bridge on. Well, the drat thing don't fit, I must have ordered the wrong bridge. Sigh. Delidding my processor was surprisingly easy, clamped it in the vice, tapped the piece of wood 3 times with a hammer, and boom, clean silicon. The Liquid Ultra was surprisingly easy to use too. I didn't even squeeze the tube, there was enough in the cap to get a full spread on the die. Ended up only putting a single video card back in because I'm gonna have to tear it down again when I get the proper sli bridge. Was curious to see how far I can overclock with just a single card.

After delidding the chip and changing over to the EK waterblock, my temps are down from 70C at max to 43... 43 degrees at full tilt. Intel really screwed up with these haswell chips by cheaping out on the tim.

I'd post pics of the process but my iphone is being a bitch and won't let me access the pictures. Stupid Crapple products.

Crazy, isn't it? Now overclock it back up to 70 degrees.

Also, it's actually the glue height that causes the gap, so it's even stupider. Of course, there's no reason not to use CLU/CLP etc. if you do delid.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Just grabbed a 2-month-old 780 Classified for $450. I'm ready for my Oculus Rift.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

td4guy posted:

Heck, I catch flak from my friends for turning up the graphics in that game with my 680. They say the added visual clutter from high graphics settings makes enemies harder to see, making me more likely to get killed. Turn everything to Low they say. *sigh* It's so pretty!
This is only true about the foliage/terrain options, which have an objectively negative effect on gameplay (foliage blocks spotting for some retarded reason).

Ignoarints
Nov 26, 2010

TheRationalRedditor posted:

This is only true about the foliage/terrain options, which have an objectively negative effect on gameplay (foliage blocks spotting for some retarded reason).

And airburst :mad: Oh, I'm sorry, was there a leaf sort of near the center of my screen?

Don't worry dude, crank that poo poo to the max and just play better than them.

veedubfreak
Apr 2, 2005

by Smythe

Ignoarints posted:

Crazy isnt it. Now overclock it back up to 70 degrees.

Also its actually the glue height, so its even stupider. Of course there is no reason not to use CLU/CLP etc if you do delid though

Ya, I'm running 4.3GHz right now. Gonna try to push it later. What's the safe voltage again without worrying about migration? 1.4?

It's funny how much passive cooling 4 radiators do. With my fans running at just enough power to actually make them spin, my water temp is only 12F above ambient. Video card is running at 43C steady.
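One unit note on that 12F figure: a temperature *difference* in Fahrenheit converts to Celsius with just the 5/9 scale factor, since the 32-degree offset cancels out for deltas. Quick Python sanity check:

```python
def delta_f_to_c(delta_f):
    # Delta conversion only needs the scale factor; the 32-degree
    # offset cancels when subtracting two temperatures.
    return delta_f * 5.0 / 9.0

print(delta_f_to_c(12))  # a 12 F rise is roughly a 6.7 C rise
```

So 12F above ambient works out to roughly 6.7C above ambient.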

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
There is no voltage at which electromigration stops completely. We don't have hard numbers for where the 24/7-safe limit is (i.e. lasts 3 years of 24/7 use), but I wouldn't put it anywhere higher than 1.3V even at low temperatures from a watercooling setup.

If Haswell is like Ivy Bridge (and they're the same process, so it probably is), 1.4V lasts ~1.75 years of 24/7 full load.
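For a feel of why lifetime drops off so quickly past those numbers: electromigration wear is commonly modeled with an exponential voltage-acceleration term. Here's a toy Python sketch anchored to the ~1.75-years-at-1.40V figure above; the acceleration constant `beta` is invented purely for illustration, not an Intel spec, and real lifetime also depends heavily on temperature and current density:

```python
import math

def mttf_years(vcore, v_ref=1.40, mttf_ref=1.75, beta=15.0):
    # Exponential voltage-acceleration model: each increment of core
    # voltage shrinks expected lifetime by the same multiplicative
    # factor. beta is a made-up illustrative constant, NOT measured data.
    return mttf_ref * math.exp(-beta * (vcore - v_ref))

for v in (1.25, 1.30, 1.35, 1.40, 1.45):
    print(f"{v:.2f} V -> ~{mttf_years(v):.1f} years (toy model)")
```

The shape is the point, not the absolute numbers: a model like this is why advice gets much more conservative in small voltage steps near the top of the range.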

Ignoarints
Nov 26, 2010

veedubfreak posted:

Ya, I'm running 4.3ghz right now. Gonna try to push it later. Whats the safe voltage again without worrying about migration? 1.4?

It's funny how much passive cooling 4 radiators do. With my fans basically running at only enough power to actually make them spin my water temp is only 12F degrees above ambient. Video card is running at 43c steady.

I've looked into voltages for hours. I have a feeling that because most people are thermally limited, and because the Intel datasheets don't even mention vcore limits for Haswell, there isn't as much good information as there usually is.

1.20-1.30 is considered "tame" or normal overclocking, and 1.35 is where a lot of people stay if they can. 1.35-1.40 is a little fuzzy, and 1.45+ is considered death long term. Based on what, I don't know, but I don't want to find out. I run at 1.404 for 4.6, which is what people usually get 4.7GHz with.

veedubfreak
Apr 2, 2005

by Smythe
Set 1.35v trying out 4.6ghz. Was lazy when I set the rig up and just used the "auto 4.3" clock settings that come with the board before.

I'm still simply amazed at just how bad the stock thermal interface is under the heat spreader and how easy it was to remove the lid.

e: 4.7ghz at 1.35 still mwo stable. :P

veedubfreak fucked around with this message at 03:47 on Apr 6, 2014

Ignoarints
Nov 26, 2010

veedubfreak posted:

Set 1.35v trying out 4.6ghz. Was lazy when I set the rig up and just used the "auto 4.3" clock settings that come with the board before.

I'm still simply amazed at just how bad the stock thermal interface is under the heat spreader and how easy it was to remove the lid.

My lid was relatively difficult to remove, but I'd do it again. Someday (probably soon) these first-gen Haswells will be like "remember those Intels you had to hit with a hammer to get to work correctly?"

Edit: This is the wrong thread now, but if you run into instability at 1.35 (likely eventually), remember you can bump your VRIN up a little (1.9-2.0), raise your cache voltage (~1.15 is good to start), and lower your uncore to 3.6 for stability. Presuming your turbo boost is off.

Ignoarints fucked around with this message at 03:51 on Apr 6, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Star War Sex Parrot posted:

Witcher 3 got delayed until 2015, and along with it any motivation I had to replace my 680 GTX that I sold recently.

This is actually a good thing, because it gives me more time to play Dragonage 3.

Speaking of new games, is crytek working on anything? I can always rely on them to give my GPU a workout.

Monday_
Feb 18, 2006

Worked-up silent dork without sex ability seeks oblivion and demise.
The Great Twist

The Lord Bude posted:

This is actually a good thing, because it gives me more time to play Dragonage 3.

Speaking of new games, is crytek working on anything? I can always rely on them to give my GPU a workout.

Homefront 2, to be released this year. Made by their UK studio instead of the Frankfurt one that made the Crysis games, but it's still using the latest CryEngine.

Lutha Mahtin
Oct 10, 2010

Your brokebrain sin is absolved...go and shitpost no more!

Is there a fix for the AMD driver greying out the "use GPU scaling on this monitor" options? I have an R7 250 on Windows 7. I found a utility that claims to fix it, but it's a forum link to a ZIP file on awesome-filez.info or something, and I really don't want to mess around with something like that.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Lutha Mahtin posted:

Is there a fix for the AMD driver greying out the "use GPU scaling on this monitor" options? I have a R7 250 on Windows 7. I found a utility that claims to fix it but it's a forum link to a ZIP file on awesome-filez.info or something and I really don't want to mess around with something like that.
You have to be at a non-native resolution for that option to be available.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:
Re-cased my system, re-did my cooling and now I can't crack 49c on either 780ti no matter how hard I push them. 3 closed loop rads, a whole bunch of Noctua fans and a whole lot less decibels. I also really dig having a fan controller with temperature readouts on the front.



LRADIKAL
Jun 10, 2001

Fun Shoe
Why is each CPU/GPU on its own loop? Just for independent control? Pump noise? But then you're paying for unreliability, right? If one loop malfunctions, there goes the whole PC, right?

Also, that's the coolest-looking front controller/display thing I've ever seen. I mean, it looks cool is all. Probably like 200 bucks...

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

Jago posted:

Why is each cpu/gpu on its own loop? Just for independant control? Pump noise? But then you're paying for unreliability, right? If one loop malfunctions there goes the whole pc right?

Also that's the coolest looking front controller/display thing I've ever seen. I mean, it looks cool at all. Probably like 200 bucks...

Because my system is using CPU closed-loop coolers to cool the GPUs. I might go into 'true' liquid cooling later just for kicks but I went with this because I'm familiar with it, already had some of the parts AND plan on taking this to Quakecon. The way it's set now I don't have to worry about emptying/refilling the reservoir while driving there.

The fan controller is only around $100 and is really nice for what it is. I've got all the fans in my system linked to it and have a pretty accurate reading of the temperatures of the systems that each knob governs.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

KakerMix posted:

Re-cased my system, re-did my cooling and now I can't crack 49c on either 780ti no matter how hard I push them. 3 closed loop rads, a whole bunch of Noctua fans and a whole lot less decibels. I also really dig having a fan controller with temperature readouts on the front.





What's the point of the two fans on the bottom? When the case is closed, they're just blowing air at a metal plate.

Royal Hammer
Mar 26, 2014
GPU newbie here looking for some advice from the pros.

I just bought a R9 270x with 4GB of memory, but I'm considering returning it, and exchanging it for a GTX 760, with only 2GB of memory. (price is similar)

I bought the 270x 4GB because I was led to believe that it might give me better performance for my dual 1920x1080 screens.
But, further research has indicated the 760 would outperform the 270x in many areas, particularly if I'm only using one screen for actual gaming.
Some articles that I've read maintain that the extra memory is worthless, while others maintain that extra VRAM is the bee's knees.

This is the card that I bought: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127775

What are your thoughts? R9 270x 4GB or GTX 760 2GB?

Royal Hammer fucked around with this message at 05:42 on Apr 12, 2014

LRADIKAL
Jun 10, 2001

Fun Shoe

Don Lapre posted:

Whats the point of the two fans on the bottom? When the case is closed they are just blowing air at a metal plate.

Oh poo poo, nice catch!

Ignoarints
Nov 26, 2010
Isn't the back of the mobo, where the CPU is, always exposed? Every case I've had has a fan slot pointed at the back like that, although not usually so far away. Actually, this just reminded me to put a fan there; I forgot about it.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Royal Hammer posted:

What are your thoughts? R9 270x 4GB or GTX 760 2GB?
I'd say you made the correct choice; the cards are very similar in performance, and 2GB doesn't feel like enough VRAM in this day and age, in my opinion. An R9 280(X) 3GB may have been worth considering depending on the pricing you saw, but I don't think there's a compelling reason to make a change.

Royal Hammer
Mar 26, 2014

Alereon posted:

I'd say you made the correct choice, .... An R9 280(X) 3GB may have been worth considering depending on the pricing you saw, but I don't think there's a compelling reason to make a change.

I'm glad you think so! I was looking at the 280x, but they're all $50-75 above the budget I gave myself. I figure the 270x should be enough to hold me over for at least the next year.

90s Solo Cup
Feb 22, 2011

To understand the cup
He must become the cup



Royal Hammer posted:


This is the card that I bought: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127775

What are your thoughts? R9 270x 4GB or GTX 760 2GB?

I have one of these in 2GB form: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121802

Like Alereon says, it trades blows with the GTX 760. A good thing too, considering I picked mine up a couple months back for $40 less than what the 760 was retailing for at the time ($259). :getin: Sadly, I haven't had any GPU-heavy games (GTA IV demands more from the CPU) to cane the poo poo out of it with, so I haven't formed any definite impression just yet.

If I was in the market for a new card right this second, I'd probably just get a GTX 770 and be happy with the vastly improved performance, especially with dual monitors.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

Don Lapre posted:

Whats the point of the two fans on the bottom? When the case is closed they are just blowing air at a metal plate.



There is airflow from down there, albeit not 100% direct. The hard drive cages in the front sit in the opening. The back of the mobo is also exposed, although I can't imagine much cooling happening from the backside. Here is a cleaner image from the case's site; now imagine that vertical fan spot not being covered with a plate. The bottom fans aren't critical, but I was going for cool air in the front and bottom, all heat output out the top, and a positive-pressure system. Any open slots on the case have air blowing out rather than in, which helps with the dust.


Ignoarints
Nov 26, 2010

Alereon posted:

I'd say you made the correct choice, the cards are very similar in performance and 2GB doesn't feel like enough VRAM in this day and age in my opinion. An R9 280(X) 3GB may have been worth considering depending on the pricing you saw, but I don't think there's a compelling reason to make a change.

I have a VRAM question. My fuzzy understanding is that you basically need more RAM when you run higher resolutions and more monitors. Also, I've read that when you SLI, you only effectively have the RAM of one card (is this true?). Yet that doesn't seem to stop SLI setups from outperforming top-end cards with 2-3x the memory, and sometimes even twice the memory bandwidth, in the same high-resolution and multi-monitor tests. I would figure this is where you would see SLI choke and be overtaken, or at least see the lead start to diminish. I've only looked into 660 Ti SLI and 770 SLI benchmarks compared to Titans and 780 Tis, so perhaps this was the case in the past and I just don't know about it.
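For what it's worth on the mirroring question: with the AFR-style SLI of this era, each card does keep a full copy of the working set, so effective capacity is one card's worth. The raw framebuffer is also smaller than people expect; textures and render targets are what actually eat VRAM. A back-of-envelope Python sketch (color buffers only, with an assumed triple-buffered 32-bit setup; ignores depth/stencil, MSAA, and textures entirely):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    # Raw color-buffer cost for a triple-buffered 32-bit framebuffer.
    # Ignores depth/stencil, MSAA, render targets, and textures,
    # which are the real VRAM consumers.
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for w, h, label in [(1920, 1080, "1080p"),
                    (2560, 1600, "1600p"),
                    (5760, 1080, "triple 1080p surround")]:
    print(f"{label}: ~{framebuffer_mb(w, h):.0f} MB of color buffers")
```

Even triple-1080p surround only costs about 71 MB of raw color buffers, which is why resolution-driven VRAM growth mostly comes from higher-res textures and extra render targets rather than the framebuffer itself.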
