 
Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
I ordered one of those evga acx cooler 970s just now because it and the gigabyte were the only decently priced ones in stock with prime shipping and I'll never touch another gigabyte card if I can help it.

It was a bit of a rush purchase though, FedEx tried to destroy my desktop to the point where all the HDD caddies were somehow shaken loose and my four year old drives spent the trip bouncing against each other. Miraculously the drives all survived, even if it knocked a few months of life off them, but my gtx 670 did not.

After reading this thread and the part picking thread and hearing about how the cooler is either subpar or bad, I'm worried I made a terrible mistake. What should I be expecting with this card? Is it bad enough to consider returning it or is it just not as good as the other custom coolers?


Desuwa
In that case I'll probably be fine; the 670 it'll be replacing had a reference cooler and my case is pretty well ventilated. I also don't generally put in the effort to overclock, so I'm fine with leaving it on the factory overclock.

I was about due for at least a GPU upgrade, even if I probably wouldn't have bothered this generation without this happening. For reference the CPU is an ivy bridge i5-3550 still using the stock cooler.

Desuwa
My evga acx 970 should arrive today (should've been Friday but ontrac loves to claim Amazon didn't give them my package). If the card is unacceptable to me I'm not going to wait for evga step-up program and I'll just return the card for the MSI or Asus versions. On the other hand if it's not too bad it won't be worth the hassle of returning it.

That's what I get for not checking the appropriate threads before making a purchase. I'd always assumed EVGA was normally the best manufacturer for Nvidia cards, but their handling of this issue has been really bad. Not to mention reusing an old cooler design for a much larger GPU.

Desuwa
After a bit of messing around I'm fairly content with the evga acx 2 970. If I'd known about the issues with the cooler and the cheap power delivery beforehand I would have bit down and bought the Gigabyte or one of the non-flagship MSI/ASUS cards, but the difference isn't worth the hassle of returning it. Thankfully my case completely muffles the coil whine and I don't have it as bad as some people do but the issue definitely exists.

It's not a very demanding game but Valkyria Chronicles was the first game I could run with 4xDSR and not have the UI shrink to the point it was unplayable. Absolutely gorgeous, and I never caught it dropping a single frame even while recording.



They've got a $10 mail in rebate going on for anyone making the mistake of ordering one in January, which I'm just barely not eligible for. As much as it's the worst 970 it's also currently going for $10 cheaper than its competition on Amazon, with the $10 MIR on top of that, which could be tempting if you're willing to gamble on the MIR and accept shittier temps.

Desuwa

BurritoJustice posted:

The second one is quite clearly exactly the same cooler from the Anandtech review that is poo poo

Scroll down the page and see the cooler shot, it has those heatpipes.

The curved around, poorly contacting heatpipes were on the original 970 ACX 1.0, these new cards have the large nickel contact plate that is clearly in that picture on the FTW.

The ACX 2.0 on my SC has the curved heatpipes.

It's actually worse than the FTW version Anandtech reviewed: the FTW has a fourth heatpipe, while the new ACX 2.0 with the straight pipes still only has three.

When even EVGA's marketing only claims a 6% improvement I won't bother with the step-up program. I'm just going to return it to Amazon once a replacement MSI card arrives. I waffled back and forth for a while over whether I was going to bother with the return or not, but seeing such an inefficient and lazily redesigned cooler just tells me that EVGA doesn't care in the slightest.

Desuwa fucked around with this message at 22:36 on Jan 13, 2015

Desuwa

BurritoJustice posted:

So there are four EVGA coolers, ACX, ACX 2.0, ACX 2.0 that came with the FTW, and ACX 2.0 that comes with the SSC (which has less heatpipes?). EVGA's 6% is comparing the new one (the 4th) to the curved heatpipes ACX (the first or second) instead of the 3rd which by all accounts is better than the 1st and 2nd while still being lovely. By all accounts they are all lovely. Hilarious.

As far as I know all the non-FTW cards have three pipes, where the FTW versions have four. It's their attitude more than their technical incompetence that has me going through the motions to replace it.

Desuwa
Switched in an MSI 4G Gaming 970 to replace my EVGA 970 SC. While I'll give the EVGA a tiny amount of credit for having a higher stock overclock, the MSI runs 12-15C cooler at both idle and load. After the BIOS update I'm not sure if I ever heard the EVGA card making noise over my Intel stock cooler, but I can't hear the MSI even after upgrading to a much quieter Noctua cooler. Definitely blows away the "6%" improvement I would have gotten with EVGA's step-up program.

Desuwa

Megasabin posted:

I just switched graphics cards from an ATI 7970 to a GTX 970, and am struggling to regain my 3 monitor setup due to different port configurations. On the 7970 I had a 3 monitor set up with 1 display port cable, and 2 DVI cables. I can't use this setup, because the GTX 970 only has 1 DVI port. One of my monitors can do display port, but the other two can only do DVI. Is my only option to regain third monitor use to buy an active DVI-Displayport Adapter? Am I correct in thinking a passive adapter won't work here?

Which 970 did you get that only has the one DVI port? Always make sure the video card has the ports you expect before buying it. But yeah, you're correct that you'll need an active adapter.

Personally though, I can't stand displayport on my monitors. Windows treats me powering off my monitors as disconnecting them and scrambles all my windows and icons around (not that I have too many icons). DVI is blessedly dumb and doesn't send the same information through the cable.

Desuwa
Replaced my EVGA ACX 2.0 card with the MSI Gaming card more because I was upset with the way they were handling it than being unhappy with the card itself. Noise was basically unchanged (I had EVGA's newer BIOS which does park the fans at idle) but temps dropped a good 10-12 degrees under load.

The cards are the same price (actually the MSI was $5 cheaper when I bought it) so there is no reason to buy the EVGA card which is objectively worse. On the other hand, if there's a price difference or the other cards aren't in stock, buying the EVGA card isn't the end of the world either. No one should be recommending it though, and recommending against it in favour of the other brands just makes sense.

Desuwa
If they were completely upfront about the issues I'd still have bought the card no problem, probably at the same price. Not that my dead 670 gave me the choice to wait for AMD, but I've been burned too many times by ATI/AMD to give them the benefit of the doubt. Higher power consumption is also a negative of the AMD cards for me, since I have relatively expensive power and leave stuff running 24/7 until it fails. I do expect AMD to capitalize on this, and good for them I guess.

The way NVIDIA is handling it isn't very good though. There's no loving way that not even one of their technical engineers read anandtech or other tech sites. Someone should have noticed this when the first reviews went up. I can believe it's an honest mistake that the wrong numbers went out, but they have to have realized there was a problem with their marketing material pretty early on and just sat quiet about it for months.

Desuwa

Subjunctive posted:

:eyepop:

I have 2 970s in my new build, and they sit at 60C and 43C in the WIndows desktop. The hotter one (primary) has the fan at about 15%. I've OC'd them to +150 core with +87mV voltage increase, but I don't think that should really matter when they're effectively idle. Is there anything I should look into before just digging into airflow in the case?

Which 970s do you have? Those aren't dangerous temperatures, and I wouldn't be worried about them, but they are higher than I'd expect for the MSI/ASUS custom coolers. One running hotter than the other is expected though; the bottom card is going to be blocking the airflow of the top card and there's not much you can do about it.

If you have room you could try moving the bottom 970 to a lower slot, provided it's going to offer the same number of PCIe lanes.

Desuwa
For my usage, given how much I game and the relatively high price I pay for power, running an AMD card would probably run me at least an extra $30 or $40 per year, based on it pulling an extra 150 watts. That's enough that it does make them less attractive purchases; over two or three years any savings are gone.

AMD has burned me too many times in the past, but assuming I wasn't biased against them, an extra $90-120 cost of ownership over the lifetime of the card would be something I would consider. Also, with or without an efficient cooler, it's still going to dump more heat into my room in the middle of summer. AMD would need to be significantly better in some way for my use case, and right now they just aren't.

Desuwa

Jago posted:

That's only when it is running at full power. The difference in power consumption between them while idling or web surfing is relatively insignificant.

http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/15

Yes, I took that into account; I assumed the difference was 0 at idle. I pay 20 cents per kWh, and I push my cards a lot with either gaming or video playback (NNEDI3 upscaling will push any card). If anything I probably lowballed the time spent at load over the course of a year.

This isn't a critical difference, it's not the first thing I look for or care about. But if the cards were otherwise equal for me, with the same performance and same amount of annoying quirks, this would be enough to push the decision towards NVIDIA. It's not that I can't afford the extra power, but I see no need to when I'm getting a card that is, at best, on par and usually worse for my usage. Of course this is a moot point because AMD cards have given me no end of issues. NVIDIA may have used up a lot of my goodwill with their dishonesty over the 970 but they'd have to do it once or twice more to be on par with AMD.

I also don't trust AMD to support cards as they get older, and that's a larger factor for me. AMD tends to stop caring about how badly they break their drivers for cards that are older than a year or two, even if they're still on the supported cards list.



e: Actually the 10W difference at idle is higher than I expected, but I was unfair saying the AMD cards were 150W higher under load. However if I cut the difference under load in half and include the difference in idle power it comes out about the same cost per year. Even if a 290x is $50 or $60 cheaper up front, it performs worse and the difference in price will disappear inside of the first two years.
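A quick sketch of that math in Python — the hours-at-load split is my own assumption for illustration, not a figure from the post:

```python
def annual_cost_usd(load_w, load_h, idle_w, idle_h, usd_per_kwh=0.20):
    """Yearly cost of the *extra* power one card draws over another,
    given a daily split of hours at load vs. hours at idle."""
    wh_per_day = load_w * load_h + idle_w * idle_h
    return wh_per_day * 365 / 1000 * usd_per_kwh

# Original estimate: 150 W extra at load only, ~3 h/day at load
annual_cost_usd(150, 3, 0, 21)  # ≈ $32.85/year
# Revised: half the load difference, plus the 10 W idle gap
annual_cost_usd(75, 3, 10, 21)  # ≈ $31.76/year
```

Both land in the same $30-ish range, which is why halving the load figure barely moves the yearly total.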

Desuwa fucked around with this message at 13:35 on Feb 7, 2015

Desuwa

Mr.PayDay posted:

I have 2 of them, no coil whine, not any issues.
You can easily oc them. I surpass my gaming buddies with their 980 Stock SLI fps in all games what really annoys them :newlol:
The eVGAs simply own.

NickPancakes posted:

Amazon is where I have money up that I want to drop. They are out of stock on the MSI GAMING 4G right now, should I hold out on that coming back in stock? Is there any major reason not to get the EVGA GeForce GTX 970 SSC ACX 2.0+ right now, seeing as it supposedly has fixed their cooling/coil whine according to most people?


Even their newest cooler is still behind the MSI/ASUS coolers. So at the same price MSI and ASUS make the better cards, but if you have brand loyalty (their customer service is supposed to be good) or you get it at a lower price it could be okay. If you're unwilling to overclock then the EVGAs might have an edge due to higher stock clocks, but overclocking any 970 is really easy.

Desuwa

SinineSiil posted:

My friend has an issue with is Gigabyte GTX970. He gets BSODs while playing Guild Wars 2. Minidump says it's Nvidia Driver nvlddmkm.sys that causes this. He only gets these when in less demanding least crowded areas. Also temperatures go unusually high there as well. Normally they are 55 degrees, but in less demanding (at least we assume) places they go as high as 69 degrees.

Sounds like it's just rendering useless frames as fast as it can and overheating the card. I wouldn't have expected it in GW2 but it's actually a pretty common problem, I know Galactic Civilizations 2, HoMM 5, and a bunch of others had/have the same issue. The card still shouldn't be crashing at stock clocks, so he should RMA it, but once he gets a new card that doesn't crash he might want to force a frame limit for GW2 to stop it from heating up so much.

Make sure it doesn't crash before setting a limit though, you want to make sure the card is stable under the worst case at stock clocks so it doesn't crop up in other games.
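For anyone curious what "forcing a frame limit" actually does: the loop sleeps off the unused part of each frame's time budget instead of rendering flat out. A rough Python sketch of the idea — the render callback is a stand-in, and real limiters live in the driver or game engine:

```python
import time

def frame_limited(render, target_fps=60, frames=None):
    """Call `render` at most `target_fps` times per second.
    `frames=None` loops forever; pass a count for testing."""
    frame_time = 1.0 / target_fps
    n = 0
    while frames is None or n < frames:
        start = time.perf_counter()
        render()  # draw one frame
        n += 1
        # Sleep off the remainder of the frame budget so the GPU
        # (and CPU) idles instead of spinning flat out on a menu.
        elapsed = time.perf_counter() - start
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)
```

Capped like this, an undemanding menu scene can't spin the GPU at maximum anymore, so it stops dumping heat for frames nobody sees.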

Desuwa

They're being way too nice to NVIDIA in that article. The incorrect specs stayed up for months on review sites; it's inconceivable that none of NVIDIA's techs read those sites, or that all of them missed the errors. If NVIDIA had corrected the specs in a reasonable timeframe I'd have no issues with them, mistakes do happen, but if not for the fact that AMD has burned me too many times in the past I would have returned my 970 over this.

Desuwa

Daviclond posted:

I disagree. I'm not looking to give Nvidia an easy pass here but it's completely conceivable that in a large, highly technical and multi-discipline working environment, two separate groups (marketing guys and engineer guys) hosed up in communicating technical specifications and a fairly obscure, esoteric piece of information was missed out. It's not the job of Engineer Guys to go on websites in the months following the product release and check every marketing chart. Checking should have happened before the information was released to marketing (though I don't know how strict their checking/approval QA processes would be - these things get very lax in the absence of audits even if they do have procedures. We can expect they're much stricter now :v:) but even if it was, "4 GB VRAM, xxx-bit bus width" is still superficially correct and I can see how it might not be caught as an error.

Nvidia deserve a bit of suffering for this gently caress-up, but my interpretation of much of the rage I've read is that it is artificial anger and stems from the notion that making enough noise about the affair would force Nvidia to offer free upgrades to 980s :psyduck:

I think you're being too easy on them. The incorrect specs stood for five months, until NVIDIA was called to task for a separate issue. Neither the bus width nor the number of ROPs were true, even in a technical sense. Even the "fast" 3.5GB runs on a shorter bus than the originally posted specs.

The 3.5 + 0.5 issue was actually separate and not just a simple case of mistaken specs being posted; as far as we know they never intended to tell anyone about it. When I pay $350 for the second best card in a lineup I expect no funny business or fudged numbers. If this were an entry level card or something, sure, hide the fact that the memory is partitioned in some fine print and no one will notice or care, but not for your second highest tier card, and definitely not without telling anyone.

This is a case where if NVIDIA had been upfront about it I would have still bought it. I only game at 1920x1200, so the fact that the VRAM bandwidth gets cut in half at >3.5GB (due to the way it accesses it, it causes even the "fast" 3.5GB to slow down significantly when forced to use the slow 0.5GB) won't affect me for the foreseeable future. However if I were still willing to deal with AMD I may very well have returned my card for a 290 or 290x.

Desuwa fucked around with this message at 10:54 on Feb 24, 2015

Desuwa
They're going to drag their heels as long as they can. Sucks for me since I was looking at getting a 4K adaptive sync monitor next year and I'll probably be stuck paying a premium for G-SYNC. G-SYNC feels a bit like NVIDIA's Mantle.

Desuwa

Zephro posted:

Thanks for the G-sync advice. I guess that means I'm holding out for the 300 series to be good, since screw paying more for a monitor feature I can get for nothing. I just hope they aren't as hot and power hungry as the current generation of Radeons.

From what I understand they're even hotter and more power hungry, at least the one (the 390x?) where their reference design includes a small CLC. Unless that wasn't real, it is pretty hard to swallow.

Desuwa

SwissArmyDruid posted:

Again, I keep saying that if they wouldn't block off the backplate so drat much with all those loving DVI ports, they could have a cooler that's perceived as much better because the tonality of the exhaust goes from EEEEEEEEEEEEEEEE to whoooooooooooooosh.

Here's hoping that Freesync needing DisplayPort means that DVI ports die on AMD products, period.

DVI finally dying will give me no end of headaches. DP, and even HDMI, are extremely annoying to deal with on multiple monitors since they treat each monitor as a PnP device and detect when it's been turned off. There still isn't a panacea for stopping this behaviour in any OS on consumer-level cards. With HDMI you can just cover one of the pins with tape, or get a cheap cable and pull it out, but that doesn't seem to work with DP.

Desuwa

GokieKS posted:

If your monitor allows you to disable DDC/CI (Dell UltraSharps do, for example), you can use that to stop it from being a PnP device. But yes, the way DP is handled is quite annoying and I would still prefer DVI given the option.

I have a Dell U2410 in front of me. With DDC/CI disabled Ubuntu (my work machine) still sees it power off enough to throw a fit. Not sure about my home machine with Windows, though. I might give it a try, I have the same monitors at home and I think I have a DP cable hanging around. I'm planning on going with a 4K monitor next time I upgrade my machine (probably when Pascal hits) and that'll necessitate DP.

Desuwa
It's not the nh-u14s, that's for sure. That's a very capable cooler. Those are the kind of temps I'd expect to see if you didn't use thermal paste at all. Definitely not normal. 1.4v is scary, something weird is going on. First thing I guess is confirm that those readings are accurate by using another program because, honestly, it sounds like the sensors are being misinterpreted.

Then I'd move on to remounting the CPU cooler along with a fresh application of thermal paste.

Beautiful Ninja posted:

I'm not super knowledgeable on CPU cooling, but don't you really need at least 1 intake and 1 exhaust fan for proper cooling? Otherwise you're either never getting cool air or never getting rid of any hot air, either solution being your CPU stewing in its own juices. But that's outside the scope of this thread I think.

He's talking about putting two fans on the CPU cooler itself for push/pull, not the case fans. Otherwise I guess if he were trying to run the cooler as a passive cooler that might explain those temps.

Desuwa

Panty Saluter posted:

Will do.

Is Noctua's paste garbage? That's what I used last time.

No, Noctua's paste is good. There's very little meaningful difference between pastes. Noctua's comes in a huge tube though, so don't use too much.

Desuwa

All but the worst (and the very best exotic solution) fit into the same 3.5-5 degree range, depending on whether it's CPU or GPU. I guess I should have said there's very little meaningful difference between pastes that aren't cheap poo poo, but eh.

The CPU graph was what I expected but I didn't expect differences for GPUs. I guess there's something significantly different about the characteristics of how GPUs heat up. But people who put aftermarket coolers on their GPUs are a much smaller group than the people who put aftermarket coolers on their CPUs so the results that are specific to GPUs are of limited use.

Desuwa fucked around with this message at 11:04 on Mar 9, 2015

Desuwa
Pre-spreading the thermal paste misses the point of the paste. The paste isn't there to coat the CPU die, it's there to fill in gaps between the CPU lid and the heatsink. If you pre-spread it you're going to be putting more paste than is needed in some places and too little in others. Unlike when you put one drop in the middle, pre-spread thermal paste will seal in air bubbles. When you put the paste in the middle and rely on the heatsink to force it away from the center the paste is all displacing air in the same direction (away from the center) so no air gets trapped.

Maybe it doesn't result in a huge, noticeable difference in temperatures most of the time, but there's no reason to pre-spread thermal paste when it takes more effort than just putting some in the middle and it can only make the contact worse, not better.


If nothing you do results in good temperatures, it's possible that your particular CPU just isn't making good contact with its lid. Delidding is not something you should be doing unless you're an expert, so you might just have to live with the temps you have.

Desuwa

Ragingsheep posted:

How are they going to address the memory segmentation issue on the 970 if you stick 8gb of RAM onto it when the current hardware isn't designed to accommodate a full 4gb?

It's less that it can't do 4GB as that the last 8th is slow (and also slows down the first 7/8ths when you try to access it). So an 8GB 970 would be 7+1GB.
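The arithmetic behind that, as a trivial sketch — assuming the same one-eighth segmentation as the 4GB card, whatever the total is:

```python
def split_vram(total_gb, slow_fraction=1/8):
    """Split total VRAM into (fast, slow) portions, assuming the
    same 1/8 segmentation as the 4GB 970."""
    slow = total_gb * slow_fraction
    return total_gb - slow, slow

split_vram(4)  # (3.5, 0.5) — the shipping card
split_vram(8)  # (7.0, 1.0) — a hypothetical 8GB version
```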

Desuwa

1gnoirents posted:

:3: sorry, sometimes DSR chat gets me all worked up even though I love SSAA. Ever since its been out I've read or heard "I have 4k" countless times. Yes 1440p is fairly rough but I'm not sure its comparable to 1440p DSR, I believe that would be a harder hit ... but I don't know.

If anything true 1440p should be a (probably insignificantly) lighter hit than 1440p DSR; DSR is just a really naive, but competent, implementation of SSAA. Without knowing the inner workings of DSR, it certainly acts like it creates a virtual monitor and then downscales images from that virtual monitor to display them on the real monitor. So it would have to internally render everything as if it were going to end up on a 1440p monitor, then an extra step at the end to downscale it.

Unless NVIDIA is taking shortcuts with DSR, which would be more difficult and somewhat defeat the purpose of it. If they were willing to put real effort into DSR they'd have been better off doing SSAA properly in the first place with a random or at least rotated grid. DSR seems like it was a quick and (relatively) easy feature that was added to give Maxwell something to do besides sit idle when playing older dx9 games.
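The resolve step described above — render at N× the output resolution, then average each N×N block down to one pixel — is easy to sketch. Here's a naive box-filter version in NumPy (grayscale for simplicity; NVIDIA's actual DSR filter differs, so this is just the idea):

```python
import numpy as np

def supersample_resolve(img, factor=2):
    """Box-filter resolve: average each factor x factor block of the
    oversized render into one output pixel."""
    h, w = img.shape
    h -= h % factor  # drop any ragged edge
    w -= w % factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# A 4x4 "render" resolved to a 2x2 output:
frame = np.arange(16, dtype=float).reshape(4, 4)
supersample_resolve(frame)  # [[2.5, 4.5], [10.5, 12.5]]
```

The cost is all in rendering the oversized frame; the averaging at the end is nearly free, which matches DSR performing about like true higher-resolution rendering.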

Desuwa fucked around with this message at 03:02 on Mar 18, 2015

Desuwa

Kazinsal posted:

Nope. I spent the last two years with Crossfire 5850s and never once had a single game run shittier with CrossFire than without it nor have I ever had major driver issues.

Some people are just so secure in their mindset that ATI/AMD drivers are poo poo circa eight years ago that they'll always be poo poo even if they were just rebranded Nvidia drivers from some strange alternate universe where AMD's just an aftermarket Nvidia card manufacturer.

I had a 7850 and its replacement both completely fail at even the most basic tasks with the newest (and second newest) drivers installed. Without the drivers, using whatever basic VGA drivers Windows loads, it was fine on the desktop. But install AMD's poo poo and both cards would start tearing and having horrible screen corruption in 2D mode. So much as scrolling in a web browser would cause the top quarter or third of my screen to turn into a barcode of random black and white stripes. It sounds like an issue with the card, but both the card and its replacement were fine without the drivers and both exhibited the same symptoms with them.

This is after AMD's poor long term support broke the handling of my 4870, which caused it to rapidly switch between 2D and 3D clocks until I started forcing it manually one way or the other. Or their default fan curve that lets the cards hit 100+C so that thermal protection kicks in, causing the fan to jump between something like 20 and 100 percent. You could almost blame Sapphire for that, but other manufacturers' cards had the same issue.

Desuwa

Cojawfee posted:

Is there a newer ACX 2.0? I ordered my card last year (not in the first batch) and ended up waiting about a month for Amazon to get them in. It has ACX 2.0 and has been running great. Is that the newer heatsink or is there a newer "2.0" which is entirely confusing.

Yes, there are multiple revisions that are called ACX 2.0. Even the newest ones are consistently 6 or 7 degrees hotter than MSI or ASUS designs. If you got it last year it has the really bad cooler and is running a good ten degrees hotter than an MSI card.

It's not bad enough that they're unusable if you have brand loyalty to EVGA, but don't go recommending them to anyone.

Desuwa

Grim Up North posted:

Nah, that's 100% NVIDIA being poo poo at drivers.

Seconding this, Firefox uses GPU acceleration for everything and it's unable to recover from driver crashes.

I can still see disabling FF alleviating problems in games though.

Desuwa

Truga posted:

Haven't you heard? AMD is the only company that releases poo poo drivers.

NVIDIA drivers have been responsible for one of the three crashes on this machine I've been running since 2012. Of the other two, running Eclipse for a week without killing it for a class caused a hard lock, and the third was a random BSOD last week out of nowhere.

Shadowplay never loving works when I want it to, ever. It always works for a few hours or days after rebooting, but some background service will eventually crash and I can never be assed to restart just to fix it. There were a few months where the drivers would stop responding but Windows would be able to recover (except for the one BSOD), even if it hosed whatever I was doing and made me restart Firefox.


After the 5 previous years of AMD I count all of that as an improvement. AMD's drivers were less stable and were utter poo poo at the basic day to day tasks required to keep a card running without micromanaging it. After a driver update my 4870 started rapidly switching between 2D and 3D clocks causing artifacts in games and videos, the fan curves stopped working reliably causing it to sit at 20% speed and rev up to 100% when it hit 105C, and artifacts were introduced in some of the games I played during the last year I used it. I fixed two of those problems by setting profiles manually but that's just stupid.

I returned my 7850 when installing drivers for it caused tearing and artifacts in 2D mode on the desktop.

Desuwa
I think AMD had some advantages in image quality even going back only a few years ago. I think AMD beat Nvidia with angle-independent AF by a year or something.

Minor, tiny things.

Desuwa

Agreed posted:

It wasn't that long ago that image quality mattered, like the generation before one of 'em (brain's telling me AMD) ripped off the other one's AF algorithm more or less entirely and it's been pretty neck and neck since then. Both of them have solid AA tech, no real difference there anymore apart from the details.

Nobody seems to care about AA anymore, though, it must all be good enough now that resolutions are getting higher for enthusiasts at least and perf is better carved out of other stuff.

I'm really out of the loop on all this crap now, I still like my 780Ti and I haven't even played the witcher 3 since it came out and my preorder turned into a game. I have however beat Crimsonland and also Red Nation on my laptop, which has a GTX 540m in it and overheats regularly. The point is, buy AMD

Baseline AA is cheap and getting fancy isn't worth the cost at higher resolutions, especially now that monitors with higher DPI are coming out that really diminish the effects of aliasing.

Desuwa

xthetenth posted:

Generally tending towards higher memory capacity for a wide bus and not leaving old generations to rot as far as drivers go are at the heart of things, but in the long term, the past few generations of AMD cards have been fantastic in the long run because of it.

I got the opposite impression, at least with my 4870 which is the single card I kept using the longest.

Two years out, every new driver would break something, but I'd still need to upgrade for certain games. One driver introduced some rendering glitches into BF2 that were never fixed, but they were minor enough that I didn't downgrade. At some point it started rapidly cycling between 2D and 3D clocks, which would cause artifacts in video and in games that weren't strenuous enough; I had to manually set profiles from then on.

The physical cards themselves, ignoring drivers, do seem fine though, even in the long run.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Don Lapre posted:

If you are doing SLI or a really really compact case with poor airflow get reference cooler. If not get an aftermarket cooler.

That being said, it will still run faster than the stock speed and be able to overclock, just not as much.

I'm planning on doing an SLI build next year with Pascal. I know you want reference coolers if the cards are adjacent, but is it better to use two aftermarket coolers with an open slot or two between the cards, or to mix one reference and one aftermarket?

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
The rule of thumb for PSUs is five years or their warranty, whichever is longer. That's assuming good PSU brands too, not cheap crap.
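That rule of thumb is simple enough to write down as one expression (a sketch of the heuristic, not anything official):

```python
def psu_trust_horizon_years(warranty_years: int) -> int:
    """Rule of thumb: trust a quality PSU for five years or its
    warranty length, whichever is longer. Applies to good brands,
    not cheap crap."""
    return max(5, warranty_years)
```

So a unit with a 10-year warranty is reasonable to keep for the full 10 years, while a 3-year-warranty unit still gets the 5-year benefit of the doubt.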

If you're just buying a new GPU now and carrying it forward into a new build next year, it'll probably be fine. If you're going to try to breathe life into that machine for another 2-3 years, I'd be really hesitant; with components that old, you'd probably want to see if you can just build a new PC instead.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

bustercasey posted:

Well, I should've held out for the msi it seems. When I'm upgrading out of this setup, I'll be sure to check here and get some real feedback before purchasing. Thanks for your input.

I did the exact same thing in January this year. I eventually returned it in favour of the MSI card, but only after EVGA repeatedly dropped the PR ball and made me mad enough to go through with it.

The card itself is going to be fine, it'll run about ten degrees (Celsius) hotter and be a tiny bit louder but it's still a cool and quiet card compared with previous generations.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Paul MaudDib posted:

Also, AMD has a history of ageing fairly gracefully.

Tell that to my 4870 where every driver upgrade after a year and a half meant another thing broke that I had to manually work around because AMD just didn't care.

After a while it was rapidly switching between 2D and 3D clocks during video playback or less intensive games (sometimes even just sitting on the desktop), causing awful visible artifacts, so I had to do the driver's job for it and tell it which clocks to use at which time.
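Doing the driver's job here just means pinning a fixed clock profile per workload instead of letting the card flip between 2D and 3D states mid-playback. A sketch of the idea, with entirely hypothetical clock values (tools like ATI Tray Tools exposed profiles roughly like this):

```python
# Hypothetical per-workload clock profiles (MHz values are made up).
PROFILES = {
    "desktop": {"core_mhz": 500, "mem_mhz": 900},
    "video":   {"core_mhz": 500, "mem_mhz": 900},
    "gaming":  {"core_mhz": 750, "mem_mhz": 900},
}

def clocks_for(workload: str) -> dict:
    """Return the pinned clocks for a workload, defaulting to the
    full 3D profile rather than risking a mid-game downclock."""
    return PROFILES.get(workload, PROFILES["gaming"])
```

The point is that a pinned profile never transitions, so the artifacts that appeared during the 2D/3D switch simply can't happen.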

The hardware itself was fine and the 1GB of VRAM lasted it a long time, but AMD cards get dementia fast as they age, even if they can still sprint.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Paul MaudDib posted:

How do you know it wasn't just some RAM or NVRAM that was failing in the pages that managed those things on-card? The failure modes are indistinguishable - the harder you run the card, the faster it's clocked and the more likely the page is to flop back to its sticky state. If you reset it, it would work until the failure mode occurred again. It's really difficult to dissociate those kind of hardware failure modes from common driver modes back when everything was hardcoded instead of being a general-purpose processor where everything was just dynamically allocated out of registers/cache/general memory. That transition didn't fully complete until GCN 1.0 (2012).

The issues sprang up immediately after upgrading drivers and went away after downgrading - at least when I bothered to downgrade. I don't just sit on issues when they happen; I try to figure out why they're happening and resolve or work around them, and the why was drivers.

One such issue was trees in the distance in Battlefield 2 no longer meshing properly with the ground after a driver patch - there would be a one-pixel ring around the base of the tree where you could see through the ground. I never found a workaround for that and they just stayed that way. This was over two years after the 4870 was new, and it was still ostensibly supported by AMD; they just no longer cared if they broke things.

I gave that PC to a friend and last I heard the card still works just fine, though I can't imagine the HDD and PSU have that much life left in them based on how much I ran them.


Maybe driver aging is less of an issue after the switch to GCN, but after being burned by drivers on two different 7850s (ruling out hardware issues) I just gave up on AMD. The problem with my 7850s was that installing AMD's drivers would cause tearing while scrolling in 2D mode - on the desktop, scrolling around in Windows Explorer on a fresh install of Windows 7. I thought the first instance was a hardware issue that just wasn't triggered by the standard Windows VGA drivers, but when it happened on the second card, even after downloading the drivers again and reinstalling, that was it.

Not sure what the problem was, because that's an issue even AMD's driver QA team wouldn't let through; my hardware wasn't weird or anything, and it didn't seem to be a widespread issue, but it happened on two completely different cards.

e: Also, three years later that new PC with the 7850s is still going strong, now with a 970 after FedEx killed my 670. In those three years I've had maybe four crashes/BSODs that weren't triggered by something obvious, like running Eclipse for a week or World of Tanks deciding to crash and bring the rest of the system down with it. So there don't seem to be any issues with the rest of my hardware or software.

Desuwa fucked around with this message at 08:53 on Nov 27, 2015


Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

HalloKitty posted:

That's very odd, as the Windows 7 desktop runs with V-Sync enabled. Are you sure you didn't disable Aero?

I'm very certain. It started when I was installing drivers for a completely fresh install of Windows 7 in a newly built PC. I assumed it was a hardware issue after I reinstalled the drivers the first time, but it happened to the replacement card too.

And actually it wasn't real tearing so much as horrible artifacting when scrolling or moving windows.

Desuwa fucked around with this message at 12:08 on Nov 27, 2015
