Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

PC LOAD LETTER posted:

I've watched a number of sites for rumors/info over the years; some of the best ones are gone now (i.e. Aces *sniff*), unfortunately. RWT and B3D are my main go-tos for GPU/CPU info/leaks. Interesting stuff still occasionally pops up on some investor boards I look at, usenet (still!), TechReport, Anandtech, WCFtech, vrzone, and [H]. It's nothing like the "old" days (i.e. 2003 and prior); people are fairly tight-lipped now that NDAs are being enforced seriously. Hawaii originally being designed for 20nm makes a lot of sense to me too. AMD must've had a 28nm version in the works the whole time as a "Plan B" in case it looked like TSMC couldn't deliver on time. A 28nm "Plan B" Hawaii designed to run hot is believable, and the HSF situation would make sense if they hadn't seriously planned to ship a 28nm Hawaii until very recently. Perhaps they wanted to get something out for the holidays and weren't sure a redesigned HSF would make it in time too?

That is all guess work though.

While their reference cooler could've been lots better, I think the throttling issue is overblown too. There are lots of people who actually own the card, and some reviewers, who've reported little or no issue with throttling with the default fan cap on the 290 or uber mode on the 290X. Most of the complaints about the heat/power/throttling issue seem to be coming from people who don't even own the card.

The hell of it is, NDAs are being so well enforced in no small part because all the tech "journalism" sites are pretty well bought and paid for at this point - who has banged the "Hawaii is hot and loud and runs WRONG! Film at eleven!" drum more than Tomshardware? Oh, yeah, they pay lip service to "well here's a fix BUT.." - it's clear they've got their agenda, just as everyone else at this point has theirs (I was really disappointed with Anand's coverage of the nVidia conference a bit back, they have a whole AMD portal on their site and that barely got a mention compared to live-blogged updates on Volcanic Islands' launch).

The "this was supposed to be on the 20nm process, and TSMC kinda hosed us!" explains both the surprising abundance of 290s that will unlock to 290X cards, as well as the problem of not having a reference cooler better than the one used for the 7970GHz in any way other than cosmetic. I don't think anyone buys the "people are used to cards running cool, but really they should run hot! As hot as possible!" bullshit, that makes no sense electrically and is just unreasonable on its face. Transistors do not work that way and AMD engineers know that as well as anybody, that smells like marketing if anything ever has in the history of the industry. But it's not like they can come out and say "well, here's the deal guys, we are in a bit of a bind and had to do SOMETHING so here's what we came up with, it's a bit last-minute but, you know, these things happen, sorry our cooler is so mediocre." It's phony, but it's necessary, I guess.

nVidia is being hilariously prickish about it, too, but that's just as necessary from their end - kick 'em when they're up, kick 'em when they're down, and in this case they're facing the troubling spot of having cards that are a bit too expensive compared to the competition for high end performance, so they've gotta spin it somehow.


KillHour
Oct 28, 2007


I spent some time looking at new benchmarks/demos, and stumbled across this.

http://www.pouet.net/prod.php?which=61211

:drat:

Does anyone have a system that can run this at 1080@30? My 670 gets as low as 4-5 FPS in some parts.

Doyzer
Jul 28, 2013

Endymion FRS MK1 posted:

Today, I learned that Gamestop sells GPUs :psyduck:

They've added the 760 and 770 from EVGA since I last looked. Decently priced, too (only about $20 more than other vendors).

JerikTelorian
Jan 19, 2007



Bloody Hedgehog posted:

If you're upgrading for AC4, then no, there wouldn't be much benefit. AC4 is horribly optimized, and even top-end video cards and dual-card setups aren't able to really brute-force through the slowdown in certain areas of the game. The foliage in that game is a killer. And ARMA... well, ARMA is its own weird beast. More powerful cards can help, but then you just get weird matches where performance goes in the toilet for no apparent reason.

Thanks. I just did some testing with ARMA and it is running a perfect 60, so it must have been some scripts or something.

I appreciate the info on AC4; the foliage thing makes sense since things are much better at sea.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

KillHour posted:

I spent some time looking at new benchmarks/demos, and stumbled across this.

http://www.pouet.net/prod.php?which=61211

:drat:

Does anyone have a system that can run this at 1080@30? My 670 gets as low as 4-5 FPS in some parts.

This is awesome. Ran smoothly for me except for one part when a thing got gelled and blew up, probably around 10-15fps.

Everyone should definitely watch this, though, hot drat what a demo. Raytracing For Real, Maya-quality in realtime. Woah.

GrizzlyCow
May 30, 2011

JerikTelorian posted:

Thanks. I just did some testing with ARMA and it is running a perfect 60, so it must have been some scripts or something.

I appreciate the info on AC4; the foliage thing makes sense since things are much better at sea.

Looks like you already got your problem sorted out, but [H]ardOCP did a review thing of ARMA III. They found some settings were CPU-bound and stuff. If you have any more problems getting a constant 60FPS, you can take a look at that article.

PC LOAD LETTER
May 23, 2005
WTF?!

Agreed posted:

I don't think anyone buys the "people are used to cards running cool, but really they should run hot! As hot as possible!" bullshit...nVidia is being hilariously prickish about it...kick 'em when they're up, kick 'em when they're down
I'd read their comments as "if you reduce temp you gotta reduce voltage and clock speed, which hurts performance". I could be totally misunderstanding them, though. Or maybe it was their PR sucking as usual. Either way, they gave nvidia an easy opening to lob PR bombs and just confused a lot of people.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

PC LOAD LETTER posted:

I'd read their comments as "if you reduce temp you gotta reduce voltage and clock speed, which hurts performance". I could be totally misunderstanding them, though. Or maybe it was their PR sucking as usual. Either way, they gave nvidia an easy opening to lob PR bombs and just confused a lot of people.

I think the main thing is just that it's so blatantly not what anyone WANTS to say. It's definitely electrically true that as you increase performance, all other things being equal, temperatures and power usage go (sometimes waaay) up. So high performance, high power draw. But when you're AMD, right now, saying that, you're trying to excuse the lack of a proper cooler and that is, as you say, just one heck of a weak spot for anyone with an agenda to poke poke poke. The intuitive notion is that we want things to be powerful and run cool - what people might not grasp (or might, I don't like underestimating folks until I see comments threads on tech sites, haha) is that really there's a whole system at work in terms of engineering for a given TDP and thermal envelope and that the behavior of that system is further complicated by some really swanky firmware-level and software-level thermal controls, from both manufacturers. nVidia just happens to have pretty much The Best Reference Cooler Ever right now, which is really opportunely timed given that AMD is stuck with a bad mismatch between cooler efficacy and their chip's operational requirements.
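The back-of-envelope version of that electrical point: dynamic switching power scales as C·V²·f, so a clock bump that also needs a voltage bump raises power superlinearly. A rough sketch - the capacitance constant and the wattage/voltage/clock figures below are illustrative, not real Hawaii numbers:

```python
# Back-of-envelope dynamic power model: P ~ C_eff * V^2 * f.
# All constants here are made up for illustration, not measured specs.

def dynamic_power(c_eff, voltage, clock_hz):
    """Dynamic switching power in watts: P = C_eff * V^2 * f."""
    return c_eff * voltage ** 2 * clock_hz

BASE_V, BASE_F = 1.15, 947e6          # hypothetical stock voltage/clock
C_EFF = 200 / (BASE_V ** 2 * BASE_F)  # pick C_eff so stock power is ~200 W

stock = dynamic_power(C_EFF, BASE_V, BASE_F)
# A 10% clock bump usually needs extra voltage too (say +5%):
oc = dynamic_power(C_EFF, BASE_V * 1.05, BASE_F * 1.10)

print(f"stock: {stock:.0f} W, overclocked: {oc:.0f} W "
      f"(+{(oc / stock - 1) * 100:.0f}%)")  # -> +21% power for +10% clock
```

The point being: +10% clock plus +5% voltage comes out to roughly +21% power draw (1.05² × 1.10), and all of that extra power has to leave through the cooler.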

Aftermarket coolers can't come soon enough - the one good thing about their current coolers sucking is that when the non-DIY aftermarket coolers do hit, it'll give the whole line a nice second wind I think, and if not as well timed as it could have been, still in time for December holiday shopping.

PC LOAD LETTER
May 23, 2005
WTF?!
Yeah, a better HSF addresses all the complaints quite nicely. What I wonder about now is how AMD is going to deal with Maxwell, assuming nvidia does have a 28nm version out by March or so of 2014.

Unless they do an amazing new HSF, an up-clocked 290/X is probably out of the cards. A $300-350 R9 290 or $400-450 290X would be a pretty nice option though! :pray:

Or nvidia could just decide to compete on brand again and sell a 880GTX, or whatever they'll call it, for $rape$ and AMD might just decide that their prices/performance are fine where they currently sit thank-you-very-much. That would be...boring. :(

PC LOAD LETTER fucked around with this message at 05:26 on Nov 22, 2013

KillHour
Oct 28, 2007


Agreed posted:

This is awesome. Ran smoothly for me except for one part when a thing got gelled and blew up, probably around 10-15fps.

Everyone should definitely watch this, though, hot drat what a demo. Raytracing For Real, Maya-quality in realtime. Woah.

Turns out my GTX 670 overclocks like a beast.

Doesn't hold a candle to your setup, but I can pull down some pretty respectable performance if I'm willing to let my GPU go into "leafblower mode".



This is the absolute lowest performing spot for me. If you had shown me that screenshot a few years ago, I would have called you a liar.

PC LOAD LETTER
May 23, 2005
WTF?!

KillHour posted:

If you had shown me that screenshot a year ago, I would have called you a liar.
Fixed cuz' you were right the 1st time.:colbert:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I hope they do take advantage of some of their performance headroom when real overclocking becomes a thing, because that hot running chip (due to mediocre cooling) turns into a much nicer one when properly cooled, as veedub has pointed out several times :) And the hardware itself is still fantastically competitive - I've overclocked my GTX 780Ti quite heavily to get my results, but an R9 290x at stock should be putting up good numbers - if I'm not mistaken, that's what's reported by Ghostpilot, who has a mildly overclocked standard 290, right? (Correct me if I'm wrong GP!)

Ghostpilot posted:

The highest temp I've hit on my GPU after some intense gaming was 65c at 1 Ghz (stock is 947). So I may try to see how far I can push things, although I've always been more reluctant to test a GPU's limits.

Oh by the way:

60.6 GPixels/sec, 151.5 GTexels/sec, 320.0 GB/sec memory bandwidth at 1 Ghz.

That's an excellent geometry figure and plenty of bandwidth; nVidia leans heavy on texels, and benches tend to favor them unfairly in my opinion. The fact that I'm actually pushing 1280 (!!!!!) MHz on my GTX 780Ti now, and I've got less geometry throughput but massively higher texel throughput at a memory bandwidth difference that is virtually academic (54.5 GPixels/sec, 272.6 GTexels/sec, 355.2 GB/sec memory bandwidth), means AMD already has some pretty crazy price:performance figures. Games that stress things that somehow rely on that goofy high texel throughput figure more than geometry are obviously going to push substantially in nVidia's direction, but there's a reason all the 780-family cards lose their edge as resolution rises, even when clocked out the rear end. If he's got another 150-200MHz that he can get out of that card, he'll outrun my goddamned expensive card handily in some applications and catch it pretty well in others; its texel throughput is frankly a bit overkill. I mean, it's great for playing Crysis 2, where there are oceans under geometry ne'er to be seen, but any well-optimized game just isn't going to need that level of tessellated detail; it isn't feasible for many actors at high resolutions. Not to mention the VRAM difference.

AMD has come correct in every way they could except for cooling, and the Frostbite engine folks are riding their licensees harder to integrate Mantle than Epic is riding theirs to integrate PhysX, so... I dunno, I see cool things coming from both sides. One thing is damned certain, we've got unbelievably powerful hardware already available and time will continue to bring even more powerful hardware to us.

This card has more than five times (maybe more than six times) as much GTexel throughput compared to the card that I built into this system, a GTX 580. That is a lot of progress in effectively 1.5 generations and a little over two years.

Agreed fucked around with this message at 05:44 on Nov 22, 2013

KillHour
Oct 28, 2007


PC LOAD LETTER posted:

Fixed cuz' you were right the 1st time.:colbert:

Agreed. Chromatic aberration, barrel distortion, Bokeh, reflections, and refractions all in real time at 1080P. :psyduck:

If it was anti-aliased, I would have guessed a 2-3 hour render per frame.

KillHour fucked around with this message at 05:54 on Nov 22, 2013

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Agreed posted:

I hope they do take advantage of some of their performance headroom when real overclocking becomes a thing, because that hot running chip (due to mediocre cooling) turns into a much nicer one when properly cooled, as veedub has pointed out several times :) And the hardware itself is still fantastically competitive - I've overclocked my GTX 780Ti quite heavily to get my results, but an R9 290x at stock should be putting up good numbers - if I'm not mistaken, that's what's reported by Ghostpilot, who has a mildly overclocked standard 290, right? (Correct me if I'm wrong GP!)

Nope, you're correct! I figured I would go with a very gentle overclock for now, basically matching the 290x's numbers (shaders aside, of course) at stock. The highest temp I've hit on the GPU was 63c. Curiously, VRM temps only show up in HWInfo64, which is pretty taxing to have open alongside anything that would properly tax the system. I'd really like to know why that's the case.

As an aside, I hadn't typically paid as much attention to my GPU monitoring in the past, but it's really surprising how taxing even a Youtube video can be on a video card.


Agreed posted:

That's an excellent geometry figure and plenty of bandwidth; nVidia leans heavy on texels, and benches tend to favor them unfairly in my opinion. The fact that I'm actually pushing 1280 (!!!!!) MHz on my GTX 780Ti now, and I've got less geometry throughput but massively higher texel throughput at a memory bandwidth difference that is virtually academic (54.5 GPixels/sec, 272.6 GTexels/sec, 355.2 GB/sec memory bandwidth), means AMD already has some pretty crazy price:performance figures. Games that stress things that somehow rely on that goofy high texel throughput figure more than geometry are obviously going to push substantially in nVidia's direction, but there's a reason all the 780-family cards lose their edge as resolution rises, even when clocked out the rear end. If he's got another 150-200MHz that he can get out of that card, he'll outrun my goddamned expensive card handily in some applications and catch it pretty well in others; its texel throughput is frankly a bit overkill. I mean, it's great for playing Crysis 2, where there are oceans under geometry ne'er to be seen, but any well-optimized game just isn't going to need that level of tessellated detail; it isn't feasible for many actors at high resolutions. Not to mention the VRAM difference.

AMD has come correct in every way they could except for cooling, and the Frostbite engine folks are riding their licensees harder to integrate Mantle than Epic is riding theirs to integrate PhysX, so... I dunno, I see cool things coming from both sides. One thing is damned certain, we've got unbelievably powerful hardware already available and time will continue to bring even more powerful hardware to us.

This card has more than five times (maybe more than six times) as much GTexel throughput compared to the card that I built into this system, a GTX 580. That is a lot of progress in effectively 1.5 generations and a little over two years.

Yeah, the numbers are closer than I thought they would be (texels aside)! Though I have to admit that you have quite a bit more peace of mind than I do at the moment. Not that I've had any issues aside from the black screen from leaving the system on overnight, but I can't shake this feeling of another shoe dropping. Hopefully it'll pass over the next few days.

Nevertheless it's nice to have a juicy video card for the first time in...goodness, 5 years.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Ghostpilot posted:

Nope, you're correct! I figured I would go with a very gentle overclock for now, basically matching the 290x's numbers (shaders aside, of course) at stock. The highest temp I've hit on the GPU was 63c. Curiously, VRM temps only show up in HWInfo64, which is pretty taxing to have open alongside anything that would properly tax the system. I'd really like to know why that's the case.

As an aside, I hadn't typically paid as much attention to my GPU monitoring in the past, but it's really surprising how taxing even a Youtube video can be on a video card.


Yeah, the numbers are closer than I thought they would be (texels aside)! Though I have to admit that you have quite a bit more peace of mind than I do at the moment. Not that I've had any issues aside from the black screen from leaving the system on overnight, but I can't shake this feeling of another shoe dropping. Hopefully it'll pass over the next few days.

Nevertheless it's nice to have a juicy video card for the first time in...goodness, 5 years.

Plus, your The High End card cost barely more than a 770 does, and it just whomps that card for performance :) I hope they didn't end up shipping out a batch of crap units; I'm also a little unsure how a driver patch/hack could address a physical memory issue... It seems like AMD was in a lot of ways kind of caught in a do-or-die situation here. I am coming more and more to be convinced they had intended to launch this architecture on the 20nm process and had to figure out an alternative quickly. It's just a very tidy explanation for a number of things that are otherwise very :raise: behaviors.

I admit that I do enjoy the peace-of-mind that an nVidia purchase affords me. While their drivers have had some serious issues over the last few months as they transitioned from the 316 branch to the 322s and onward, they seem to have settled into a comfortable spot with great performance as of the last WHQL for all modern products and they're even working on isolating whatever the hell has been going on with the 560Ti card issues. AMD definitely has a moneybags disadvantage there, I don't think they even officially support the 5000-series cards anymore, do they? They seem to have to focus pretty heavily on whatever they're currently working with, especially after the FCAT release from nVidia exposed their frame pacing issues to the world. Triage for the older cards and less commonly used modes sucks... But there's only so much they can realistically do, and I would be a prick if I didn't recognize that they are offering a pretty outstanding value prospect, a few issues aside. For $80 more than this 780Ti cost me, you can get two R9 290s if they're not being overcharged at retail, and those numbers we were comparing obviously tilt very heavily in AMD's favor at that point. They are the clear choice for very high resolution gaming, in my opinion, apart from offering the best raw price:performance before the current air overclocking situation (or lack thereof) is taken into account.

It is exciting. :3: And I bet going from a mediocre performer to a close-to-top-of-the-line card after five years is awesome. I kinda wish I could have that experience, but the chain of cards for me has been a GTX 280, which held 1280x1024 until I upgraded my monitor and my graphics card at the same time (and built a new comp) in June of 2011, when I went with a GTX 580... And then a year later, a GTX 680... And then a year later, a GTX 780... And now, a few months later :suicide:, a GTX 780Ti. My wife is amazingly tolerant of my shiny poo poo addiction provided I make room for the new by moving the old (and occasionally toss in something extra from my general pile o' stuff - after all, business before pleasure, but once things are taken care of she doesn't mind indulging, even when it's, as FF put it, basically turning me into the Homer Simpson of graphics card buyers at this point). Maybe I should wait until Volta just to get that exciting feeling of HOLY poo poo THIS IS AMAZINGLY HIGHER PERFORMANCE instead of "woohoo, somewhere between 30% and 50% additional performance, moderate FPS gains here I come!" for a large outlay, comparatively speaking.

But that is, realistically, almost certainly not gonna happen :smith:

Agreed fucked around with this message at 06:53 on Nov 22, 2013

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy
I'd just like to chime in with hoping the R9 290X turns out to have more legs left in it than what the stock cooler and the stock power delivery system can give it.

Before i returned my 290x for coil whine (seems to be a big problem with the reference cards atm) I overclocked it to

Core: 1160mhz
Memory clock 1,500MHz
added voltage 100mv
http://www.3dmark.com/fs/1140385
And that got me a 5541 score on Firestrike Extreme, so now I'm waiting for the aftermarket cards to come, with money in my wallet whispering to me to spend it :(
The thermal claims of being designed to run at 95c seem like a cop-out for the totally worthless stock cooler. And seeing how lower temps should mean lower resistance, shouldn't that mean that with better cooling you could get even better overclocking potential, if at 94c it will happily run at 1GHz?
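The intuition is roughly right, and the bigger effect than resistance is leakage current, which climbs steeply with die temperature. A hedged sketch using the common rule of thumb that leakage power roughly doubles per ~10c rise - the 10 W baseline below is made up for illustration, not a measured Hawaii figure:

```python
# Rule-of-thumb only: static (leakage) power roughly doubles for every
# ~10 C rise in die temperature. The 10 W baseline is an invented figure.

def leakage_power(base_watts, base_temp_c, temp_c, doubling_step_c=10.0):
    """Scale a baseline leakage figure exponentially with temperature."""
    return base_watts * 2 ** ((temp_c - base_temp_c) / doubling_step_c)

hot = leakage_power(10.0, 65.0, 95.0)   # reference cooler territory, 95c
cool = leakage_power(10.0, 65.0, 65.0)  # watercooled territory, 65c
print(f"95c: {hot:.0f} W leakage vs 65c: {cool:.0f} W")
```

So, under that rule of thumb, dropping from 95c to 65c cuts the leakage component to an eighth, which is power headroom a better cooler hands straight back to boost clocks.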

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

woppy71 posted:

Thanks for the info and links :) That AnandTech site looks pretty interesting, I'll take a closer look at that.

Luckily, noise isn't an issue for me, so the need for a passive cooled option isn't high on my wish list, an actively cooled card would be fine for me. I was looking at this http://www.amazon.co.uk/gp/product/B009X4J8Y8/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A3P5ROKL5A1OLE as a possible purchase. Is the XFX brand a reputable one?

This isn't really the parts picking thread, but that card should be fine.

PC LOAD LETTER
May 23, 2005
WTF?!

Agreed posted:

I don't think they even officially support the 5000-series cards anymore, do they?
They do; the last update for them was in September. Driver updates aren't done often for older GPUs because, according to AMD, most of the performance has been wrung out of their older VLIW-based GPUs. Updates are still done for stability, bug fixes, or to support new OSes like Win8. For the 4xxx, 3xxx, and 2xxx cards they have a "legacy" driver, which was last updated on 10/13/2013.

Their driver support has actually been pretty good for a while now. For single cards you've got nothing to worry about. It's just very different from how Nvidia does things. They do WHQLs "as necessary" for the office guys, and for the :pcgaming: gamerz :pcgaming: they do betas frequently; usually every month there is a new one. Doing the whole CCleaner/Driver Cleaner bit isn't necessary most of the time either - only if you have issues after installing a new driver. Personally, I've just been installing over the old ones for quite a while now. Haven't had a problem.

CF is a different story but even that has improved quite a bit too.

PC LOAD LETTER fucked around with this message at 10:58 on Nov 22, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Michymech posted:

I'd just like to chime in with hoping the R9 290X turns out to have more legs left in it than what the stock cooler and the stock power delivery system can give it.

Before i returned my 290x for coil whine (seems to be a big problem with the reference cards atm) I overclocked it to

Core: 1160mhz
Memory clock 1,500MHz
added voltage 100mv
http://www.3dmark.com/fs/1140385
And that got me a 5541 score on Firestrike Extreme, so now I'm waiting for the aftermarket cards to come, with money in my wallet whispering to me to spend it :(
The thermal claims of being designed to run at 95c seem like a cop-out for the totally worthless stock cooler. And seeing how lower temps should mean lower resistance, shouldn't that mean that with better cooling you could get even better overclocking potential, if at 94c it will happily run at 1GHz?

I just reinstalled 3Dmark (had uninstalled it to switch to the Steam version for less hassle, hah) so I could see how it does in extreme - at my OC, I'm hitting 5680. Not sure to what extent my CPU having to be downclocked from its old 4.7GHz is hurting me there, it knocked a solid 800 points off my physics score in 3DMark11 :pcgaming: When I turn this expensive box of parts into a Haswell-based computer, I wonder if I'll be able to get a salty enough OC to make that particular metric take off? It'd be pretty neat to push 6000 on extreme.

Man, this is the worst, haha. There are so many better ways to use a graphics card than benchmarks. I'm never going to do a suicide run to try to break records, and I honestly don't find the benchmark results in 3DMark to be real indicators of performance since it's a bit common-denominator and actual game situations are going to respond better to more geometry or more shader power of a certain kind or (etc., the sundry things that make up the architectural differences between nVidia and AMD cards). But still, for a $500 card with ~about the same OC to be ~about the same as a $730 card in the e-peen bench designed to test the system, that's a winning setup, just needs some time to get the heat issue worked out. So, you know, any time MSI... Gigabyte... Powercolor... Anybody, really, don't scrimp on the good parts and make some bombshell Hawaii cards alrighty?

:allears:

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

Agreed posted:

So, you know, any time MSI... Gigabyte... Powercolor... Anybody, really, don't scrimp on the good parts and make some bombshell Hawaii cards alrighty?

:allears:

I have a 360mm + 240mm rad, a waterblock, and a 1kW PSU waiting for a 290X with a really good power delivery system so I can crank the gently caress out of it. Come on ASUS, give me a Matrix version ASAP.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Was thinking about that Game of Pwns poster and then I saw this (blue highlighting mine):



Still nice to see that ASUS is getting some more competition in the SFF market.

vvv I almost thought my card had entered the Bermuda Triangle by now. This afternoon's going to be fun.

Sidesaddle Cavalry fucked around with this message at 18:40 on Nov 22, 2013

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
In baby steps I've upped my GPU clock to 1100 mhz on stock voltage. No hint of trouble yet with a high of 68c in Furmark. Curiously, Furmark doesn't max out the GPU clocks, hanging around 890-910 Mhz. However my GPU and VRM temps did get quite toasty (68c on the GPU and 88c on the VRM before I killed Furmark). No artifacts or any maladies to report with 37-38c idle with an ambient of 23c.

I was mistaken earlier, by the way: those figures I listed were at the stock clocks.

947 Mhz (Stock):
60.6 GPixels, 151.5 GTexels.

1000 Mhz:
64.0 GPixels, 160 GTexels.

1100 Mhz:
70.4 GPixels, 176 GTexels.

I'm hesitant to push it too much further due to the VRM temps (I read that they're rated for 125c, but even 100c is too toasty for me). Besides, about a 16% increase on stock voltage ain't bad.
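Those figures scale exactly with core clock, because pixel and texel fillrate are just clock times unit count. A quick sketch to check the math, assuming the R9 290's published 64 ROPs, 160 TMUs, and 512-bit / 5 Gbps effective GDDR5 - nothing here is measured, just arithmetic:

```python
# Sanity check on the fillrate figures above: pixel and texel rate are
# just core clock x unit count. Unit counts and memory config are the
# R9 290's published specs.

ROPS, TMUS = 64, 160            # R9 290 render back-ends / texture units
BUS_BITS, MEM_GBPS = 512, 5.0   # 512-bit bus, 5 Gbps effective GDDR5

def fillrates(core_mhz):
    """Return (GPixels/s, GTexels/s) at a given core clock in MHz."""
    return core_mhz * ROPS / 1000, core_mhz * TMUS / 1000

for mhz in (947, 1000, 1100):
    gpix, gtex = fillrates(mhz)
    print(f"{mhz} MHz: {gpix:.1f} GPix/s, {gtex:.1f} GTex/s")

# Memory bandwidth: bus width in bytes x effective data rate.
print(f"bandwidth: {BUS_BITS / 8 * MEM_GBPS:.0f} GB/s")  # 320 GB/s
```

Running it reproduces 60.6/151.5 at 947 MHz, 64.0/160 at 1000, and 70.4/176 at 1100, plus the 320 GB/sec bandwidth figure quoted earlier.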

Edit: Just got an update on the package, Cavalry. It just hit his city today and is scheduled to be delivered today!

Ghostpilot fucked around with this message at 18:18 on Nov 22, 2013

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Endymion FRS MK1 posted:

Today, I learned that Gamestop sells GPUs :psyduck:

If you happened to buy a dud, you'd be hosed. They won't allow returns to their stores, and they don't even post a return policy online; they just say "instructions are included on the packing slip".

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai
I guess that article from Tom's about replacing the thermal paste on the 290/290X does have some truth to it.

After putting some new thermal paste (Antec Formula 7) on the GPU and slapping the reference cooler back on on Monday, I've noticed that it takes a good bit longer to reach 95C, and the clocks actually stay at a stable 947 Mhz now. Before, it only took like 50 seconds of gameplay or Furmark to warm it up, and it would constantly jump from 947 Mhz down to something like 906 Mhz and back again.

Now, I just need to wait to get my refund for the Xtreme III, which Newegg should be getting in today. I think I'm going to opt for the Gelid once I do, but tying on an Antec 620 just seems so tempting. Doesn't seem like it's possible with the default RAM/VRM block though (at least from what I'm seeing on OCN), so I'd have to pay extra for some decent RAM and VRM heatsinks.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Rahu X posted:

I guess that article from Tom's about replacing the thermal paste on the 290/290X does have some truth to it.

After putting some new thermal paste (Antec Formula 7) on the GPU and slapping the reference cooler back on on Monday, I've noticed that it takes a good bit longer to reach 95C, and the clocks actually stay at a stable 947 Mhz now. Before, it only took like 50 seconds of gameplay or Furmark to warm it up, and it would constantly jump from 947 Mhz down to something like 906 Mhz and back again.

Now, I just need to wait to get my refund for the Xtreme III, which Newegg should be getting in today. I think I'm going to opt for the Gelid once I do, but tying on an Antec 620 just seems so tempting. Doesn't seem like it's possible with the default RAM/VRM block though (at least from what I'm seeing on OCN), so I'd have to pay extra for some decent RAM and VRM heatsinks.

I had the same bit with Furmark (it wouldn't even hit stock speeds), despite being well below throttling temps. I really don't know what's going on with that. Heck, I haven't even hit 70c yet. If you end up with the Antec 620, I'm really curious to see how that pans out.

GrizzlyCow
May 30, 2011
AnandTech had a similar experience with Furmark. In their R9 290 review, they commented that the card bottomed out at 662MHz. I think the Hawaii cards just kind of throttle down during stress tests regardless of temps.

Factory Factory
Mar 19, 2010

This is what Arcane Velocity was like.
Both AMD and Nvidia have publicly called Furmark a "power virus" and explicitly throttle the poo poo out of it more so than other programs. I guess AMD just got mad at it? And cranked the throttling to eleven.

Parker Lewis
Jan 4, 2006

Can't Lose


Sidesaddle Cavalry posted:

Was thinking about that Game of Pwns poster and then I saw this (blue highlighting mine):



Still nice to see that ASUS is getting some more competition in the SFF market.

vvv I almost thought my card had entered the Bermuda Triangle by now. This afternoon's going to be fun.

I got one of those "Sorry, I'm busy gaming" door hangers with my MSI Z87-G45 motherboard. Hung it on the door to the basement one time, my wife was not amused.

That's really cool that MSI is doing ITX versions of their Z87 board and 760 video card, though. If those had been around earlier this year I might have gone that route instead of doing another ATX build, although from what I remember my ITX case (Fractal Node 304) will actually fit a full-size dual-slot card. The short version would definitely give some more flexibility in terms of power supply choices though.

Factory Factory
Mar 19, 2010

This is what Arcane Velocity was like.
If a full-size 760 gets a Twin Frozr cooler, what's the mini get? Just Frozr?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

Both AMD and Nvidia have publicly called Furmark a "power virus" and explicitly throttle the poo poo out of it more so than other programs. I guess AMD just got mad at it? And cranked the throttling to eleven.

Considering the inadequacies of the stock cooler, the fact that even with better cooling the VRM temperature can get up to 100°C or higher, and also that Furmark is bloody useless at doing anything but damaging components... It's not really a power virus, at least by my definition, since all operations remain under the control of the user and it doesn't run any uarch-specific things to really max out the power, buuuut it's also not a meaningfully useful stress test tool either. Nothing draws power like Furmark. Metro: LL doesn't draw power like Furmark. I'm fine with companies keeping their VRMs alive and well; the last thing I want is for a "testing" program to cook my friggin' VRMs.
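For what it's worth, the behavior people are describing is consistent with a plain power-limit governor. This is purely my own toy model (the DPM clock states, the power limit, and the power formula are all made up for illustration, not AMD's or nVidia's actual firmware logic), but it shows why a "power virus" style workload settles at a lower clock bin than a game does even at identical temperatures:

```python
# Toy model of a PowerTune-style governor: drop to the highest clock
# state whose estimated board power fits under the limit. The heavier
# the workload, the lower the steady-state clock -- no temps involved.

DPM_STATES_MHZ = [300, 662, 906, 947]   # hypothetical clock steps
POWER_LIMIT_W = 208                      # hypothetical board power limit

def estimated_power(clock_mhz, activity):
    """Toy estimate: idle floor plus a term scaling with clock and
    'activity' (fraction of execution units toggling each cycle)."""
    return 40 + 0.18 * clock_mhz * activity

def settle(activity):
    """Return the highest clock state that fits under the power limit."""
    state = len(DPM_STATES_MHZ) - 1
    while state > 0 and estimated_power(DPM_STATES_MHZ[state], activity) > POWER_LIMIT_W:
        state -= 1                       # throttle one DPM state down
    return DPM_STATES_MHZ[state]
```

With a game-like load (`settle(0.9)`) this toy holds the top 947 MHz state; push activity to Furmark-like saturation (`settle(1.0)`) and it settles one bin lower at 906 MHz, regardless of how cool the card runs.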

I've been looking back through nVidia's strategy, and man oh man do they love some texture units. The GTX 580, the card I bought in, whatsit, 2011, was trivially easy to overclock to at least 40 GPixels/sec. Similarly, and relevant to you, the GTX 680 SC - they straight up took a hit in pixel throughput on purpose, providing 33.9 GPixels/sec from a factory overclock. And I've pushed my 780 Ti quite heavily to get it to 54 GPixels/sec.

But if you look at their texel throughput, it's amazing (well, okay, with Kepler it's obviously fairly linear, an SMX is an SMX and more of 'em means more texels, but here are the numbers :shobon:)

GTX 580 - 53.4 GTexels/sec
GTX 680 SC - 135 GTexels/sec
GTX 780Ti OC - 272.6 GTexels/sec

AMD is absolutely and without question the pure geometry throughput winner - look at Ghostpilot's R9 290, that is some wicked GPixel scaling, from stock 60.6 to overclocked 70.4 - in any situation where just being able to put a shitload of pixels on screen becomes a limiting factor, AMD's architectural decisions here are really well made. But nVidia isn't stupid, either, as evidenced by their attention to game developer trends toward using shader-based and texel-bottlenecked doodads and acting accordingly. Here's an older TechReport article from the launch of the GTX 680 that goes pretty far out of its way to find real performance numbers instead of theoretical throughput. It compares nVidia's GF110 and GK104 vs AMD's Evergreen through Pitcairn/Tahiti lineup. At least with then-current benchmarks (ones we still use today) the lower ROP count and, in general, poorer geometry throughput thanks to the middlin' memory bandwidth of the GTX 680 doesn't really seem to matter all that much even when it goes pixel vs. pixel against the 7970, while everything texel related is MASSIVELY in nVidia's favor.
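As a sanity check on all those numbers: theoretical fill rates are just unit count times clock. The ROP/TMU counts below are the commonly published specs for each card, and the clocks are back-solved from the figures quoted above (so treat this as back-of-the-envelope arithmetic, not gospel):

```python
# Theoretical fill rates: GPixels/s = ROPs * clock, GTexels/s = TMUs * clock.
# Unit counts from public specs; clocks inferred from the figures above.

SPECS = {  # name: (ROPs, TMUs, clock in GHz)
    "GTX 580 (OC)":    (48, 64,  0.834),
    "GTX 680 SC":      (32, 128, 1.058),
    "GTX 780 Ti (OC)": (48, 240, 1.136),
    "R9 290 (stock)":  (64, 160, 0.947),
}

for name, (rops, tmus, ghz) in SPECS.items():
    print(f"{name}: {rops * ghz:.1f} GPix/s, {tmus * ghz:.1f} GTex/s")
```

The texel column is where Kepler's TMU-heavy layout shows up: the 780 Ti lands at roughly five times the texture throughput of the 580 while pixel throughput only grows from ~40 to ~54 GPix/s, which matches the trend described above.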

It seems to me like AMD has the better strategy for actual UHD resolutions -right now-, which was something I figured was intuitively true but more digging around seems to confirm it. nVidia has the better card for, oh, 1600p or so, with everything cranked, at least in games that aren't just better performers on the GCN 1 / 1.1 architecture. I'd imagine Maxwell will be continuing to bring extremely high texel performance but bump up their ROP count and enable a comparatively high level of geometry performance. GK110 gets this generation "for free," as it were, just because it's loving huge and, when fully enabled, has insane memory bandwidth and answers the problems of today's games really, really well. It isn't an architecture that seems like it's going to excel at UHD resolutions or greater, and that's more or less proven in practice, where the aforementioned superb scaling of the R9 290/290x in geometry bound situations really takes off. But then they get a little bit held back by texel performance that is good for last-gen... If by last gen you mean the GTX 680, not, you know, GK110.

Maxwell could do some really nifty things. I wish nVidia would talk more about it. We have maybe three confirmed notes about it, and none of them are hard facts, just "it'll have these things!" Not a word one way or another about what it's bringing to the table, I'm guessing because production issues with TSMC are forcing their timetable just as much as AMD's. From a marketing perspective I can see how they wouldn't want to steal their own thunder right now after having finally refreshed the whole lineup from bottom to top; the conversation they want to have isn't about what you should wait to buy, it's about what they can sell you right now that's really powerful and good. I get that. And they're going to need something to start banging the hype drum once this moment is over for them (which will probably coincide with the release of AMD partners' after-market coolers, but come before any real-world Mantle stuff makes its way to players).

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Factory Factory posted:

Both AMD and Nvidia have publicly called Furmark a "power virus" and explicitly throttle the poo poo out of it more so than other programs. I guess AMD just got mad at it? And cranked the throttling to eleven.

That would explain it! Glad I kept an eye on the temps. It was rock-solid during the run though, and after hearing about how much it punishes cards, that goes a long way towards alleviating my fears about the cooling and the card itself.

By the way, Sidesaddle Cavalry, the card has arrived and he's absolutely ecstatic! Going from a 5500 series to that has blown his mind. He had eye surgery to correct his failing vision earlier in the year, and now that his vision is 20/20 for maybe the first time in his life, he's finally able to fully enjoy the games he has on high quality. :neckbeard:

Now I believe all that's left is for you to get Agreed's card and we'll be set!

Ghostpilot fucked around with this message at 22:17 on Nov 22, 2013

veedubfreak
Apr 2, 2005

by Smythe
I'll try and get some actual benchmarks done this weekend. I haven't had the "givashit" to actually mess around much with the system once I got it installed. Although I did flash both cards to the Asus 290x bios for extra voltage control. One thing I have noticed is that I can't seem to get the same overclock I had with just a single card. I'm not sure if this is because of crossfire or because of the 290x bios. The memory won't overclock for poo poo that's for sure. 5500 is about as high as I can get it before poo poo starts going cray. Any benchmarks you guys would like to see me run?

Also, it has pretty much been proven at this point that early XFX and PowerColor cards are, in fact, true 290X chips that were just sold as 290s with a BIOS lock. For once it pays off to be an early adopter :)

Also, seeing as next week I have 4 days off with Thanksgiving and my family is 1000 miles away, I think I'm going to put some movies on, drink some beers and tear the entire rig down for a thorough cleaning. Going to turn it into 1 giant loop instead of separate loops since I'm pretty sure I have way more radiator than I need on the cpu loop.

EoRaptor
Sep 13, 2003

by Fluffdaddy

veedubfreak posted:

Also, it has pretty much been proven at this point that early XFX and Powercolor cards are in fact, true 290x chips that were just sold as 290s with a bios lock. For once it pays off to be an early adopter :)

This isn't the first, or the second, or even the third time AMD has done this. nVidia has also done it, but they tend to 'fuse' the chips so they can never be fully enabled again, regardless of BIOS, whereas AMD lets it slide.

For benchmarking, I'd say the obvious 3DMark run, but maybe look at recent benchmarks on AnandTech or similar and see what they're using and how? If you replicate those, you'll know how your system compares, and you'll provide something other people can compare against too.

Wistful of Dollars
Aug 25, 2009

Once I have my Internet up and running tomorrow at my new place, I'll try unlocking the as-yet-unopened 290 and see what happens.

Then it'll be time to start plotting my wc adventure...

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
Hmm, I noticed during my Guild Wars 2 session that my card was throttling pretty heavily tonight. Despite being under 60C, it was throttling down to the low 900 MHz range. :raise: Yet in ShaderToyMark and such it performs just fine at the same temps. I'm left scratching my head here.

Edit: By the by, some folks may find this thread useful: Is your r9 290 Unlockable? Find out here!

Ghostpilot fucked around with this message at 03:49 on Nov 23, 2013

KillHour
Oct 28, 2007


Found a "making of," if anyone's curious as to how they got the raytracing demo to run so smoothly.

http://directtovideo.wordpress.com/2013/05/07/real-time-ray-tracing/
http://directtovideo.wordpress.com/2013/05/08/real-time-ray-tracing-part-2/

The answer: Lots of application-specific optimizations, and a rasterizer/raytracer hybrid engine.
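To make the hybrid idea concrete, here's a toy sketch (entirely my own illustration, nothing to do with the demo's actual engine): primary visibility comes from a rasterizer that writes a buffer of per-pixel hit points, and rays are only traced for the secondary work - here, one shadow ray per covered pixel against a single sphere occluder:

```python
# Toy rasterizer/raytracer hybrid: rasterize primary visibility into a
# buffer of hit points, then trace only the secondary (shadow) rays.
import numpy as np

W = H = 64  # tiny framebuffer

def raster_primary(v0, v1, v2, z):
    """Rasterization pass: cover one CCW triangle (world xy under an
    orthographic camera, constant depth z) and record per-pixel hit points."""
    xs = (np.arange(W) + 0.5) / W * 2 - 1   # pixel centers -> [-1, 1]
    ys = (np.arange(H) + 0.5) / H * 2 - 1
    px, py = np.meshgrid(xs, ys)

    def edge(a, b):  # signed edge test: >= 0 means "left of edge a->b"
        return (b[0] - a[0]) * (py - a[1]) - (b[1] - a[1]) * (px - a[0])

    covered = (edge(v0, v1) >= 0) & (edge(v1, v2) >= 0) & (edge(v2, v0) >= 0)
    hits = np.stack([px, py, np.full_like(px, z)], axis=-1)
    return covered, hits

def trace_shadows(covered, hits, light, center, radius):
    """Raytracing pass: one ray per covered pixel toward the light,
    intersected against a single sphere occluder."""
    d = light - hits                     # unnormalized directions; t in [0, 1]
    oc = hits - center
    a = (d * d).sum(-1)
    b = 2.0 * (oc * d).sum(-1)
    c = (oc * oc).sum(-1) - radius ** 2
    disc = b * b - 4.0 * a * c           # quadratic for ray/sphere hit
    sq = np.sqrt(np.maximum(disc, 0.0))
    t0 = (-b - sq) / (2.0 * a)
    t1 = (-b + sq) / (2.0 * a)
    between = ((t0 > 0) & (t0 < 1)) | ((t1 > 0) & (t1 < 1))
    return covered & (disc > 0) & between

covered, hits = raster_primary((-1.0, -1.0), (1.0, -1.0), (0.0, 1.0), 0.0)
shadow = trace_shadows(covered, hits,
                       light=np.array([0.0, 0.0, 2.0]),
                       center=np.array([0.0, 0.0, 1.0]), radius=0.3)
```

The point of the split is that the expensive scene traversal only runs for rays the rasterizer can't answer, which is the same division of labor the write-up describes (plus a pile of app-specific tricks a toy like this doesn't touch).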

forbidden dialectics
Jul 26, 2005





Being the sane person that I am, I asked my wife if it was OK if I blew $1100 on computer watercooling parts and crossfire R9 290s.

Being the sane woman that my wife is, she pulled up my Steam library and asked me which game I had spent the most time playing recently.

It was Borderlands 2. At 1920x1200.

"And do you ever get below 60 fps, even with PhysX on high?"

"No..."

"Then let's go on a vacation instead, ok?"

And that was that.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

My wife just has a "coming in? fine, better be stuff going out :keke:" attitude towards my more ridiculous computer-related expenses. But then, apart from the graphics card, my computer is central to how I work. Though lately I'm changing the kind of work I am -able- to do thanks to my useless back, I still want to do things like play the latest cool games at high settings - and I actually do that, I made a lot of room on my 7200RPM HDD today to stick the new (tremendously huge) games on, and will be shuffling some stuff around on my current system's SSDs to make room there, too.

Really need to get the next box of stuff transformed from packages to a finished system, go from less than 200GB of SSD space for games to over 1TB of much faster SSD space, some for apps, some for games, all for speeeeeed.

I dunno, what, I don't mean this harshly or pejoratively or anything, but what actual business of mine is it how things work in your household budget? If you don't want/need a new graphics card, great, likely means you'll get a better price:performance choice later on when you do :) And if your wife insists on a certain way of doing things, that's between y'all. I think my wife's "fun stuff out --> fun stuff in" policy is totally fair and we more or less came up with it together.

I don't think it hurts that I also spontaneously buy her fun stuff without any in/out on her part, haha. And I make sure we're solid financially before I even think of doing something as silly as replacing a 780 with a 780Ti.

Hell of a card, though. God drat is it fast. :allears: Also, if you've heard that Arkham Origins is lame, they were wrong, it's totally a good game. AC4, too, though I can see some serious optimization issues - I would not want to try running any non-sea areas at UHD resolutions without packing a dual-card setup, just nope nuh uh bad.

I half-rear end want to get CoD: Ghosts just so I can have another game with TXAA, plus I like shootmans games and I haven't had enough exposure to them to get burned out on the CoD franchise. Anyone here played it? Worth the price assuming it will actually be played after buying? :v: I liked Black Ops II and I didn't dislike CoD: MW3, if that helps give some context to my expectations of the franchise. Playing single player.

Edit: Ok nevermind it looks like the single player is basically just a tacked-on tech demo and your squad ends up doing more than you. gently caress that, I'll get it on a steam sale if I ever do. Also, I am amazed that 1. they are charging $50 for the season pass on top of the game, and 2. apparently it's successful enough as a franchise that people are buying it. I guess moving like a billion units on day one means this is kind of a big deal as a game but clearly meant for multiplayer dudes, not singleplayer dudes. And ne'er the twain shall meet.

Agreed fucked around with this message at 09:47 on Nov 23, 2013

Josh Lyman
May 24, 2009


Nostrum posted:

Being the sane person that I am, I asked my wife if it was OK if I blew $1100 on computer watercooling parts and crossfire R9 290s.

Being the sane woman that my wife is, she pulled up my Steam library and asked me which game I had spent the most time playing recently.

It was Borderlands 2. At 1920x1200.

"And do you ever get below 60 fps, even with PhysX on high?"

"No..."

"Then let's go on a vacation instead, ok?"

And that was that.
I'm jealous your wife knew to ask that. :allears:


forbidden dialectics
Jul 26, 2005





Agreed posted:

I dunno, what, I don't mean this harshly or pejoratively or anything, but what actual business of mine is it how things work in your household budget? If you don't want/need a new graphics card, great, likely means you'll get a better price:performance choice later on when you do :) And if your wife insists on a certain way of doing things, that's between y'all. I think my wife's "fun stuff out --> fun stuff in" policy is totally fair and we more or less came up with it together.

It's more like, I get more enjoyment out of overclocking and cooling a computer than I do actually using it to play video games. And my wife puts those kinds of things in perspective for me, such that, yeah, I really would rather have a great week in Las Vegas than have a computer that's 5% faster at loading a Civ 5 map, saving me a grand total of 9 seconds and forcing me to endure millisecond-long dips into the sub-60fps realm. First loving world problems, what a bitch!
