Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
Speaking of this, installing a Gelid on a 290 is an experience I hope not to repeat any time soon. While the site says it's compatible (and to be fair, the heatsink is), the ramsinks are compatible in the same way you could use a butterknife as a screwdriver. Those are a colossal pain in the rear end to deal with, especially the one below the GPU.

Even with the shortest ramsink it interferes with the placement of the heatsink. After taking the advice of this guide and trying to sand it down to no avail, I finally just used one of the regular ramsinks turned at a 90-degree angle to cover about 80% of the chip while the GPU heatsink covered the rest (yeah, that's what it's like dealing with this thing).

The thing just incidentally works on the 290/x. Granted, now that I have it running it does its job well (the highest I've hit was 58c on stock clocks). Knowing what I know now, I could probably cut the time by at least a third, but putting the thermal tape on each of the 24 ramsinks is not a process I'd ever wish to duplicate.

Tomorrow I'm going to try for the unlock and see just what I can manage to achieve on that end.

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

Ghostpilot posted:

Speaking of this, installing a Gelid on a 290 is an experience I hope not to repeat any time soon. While the site says it's compatible (and to be fair, the heatsink is), the ramsinks are compatible in the same way you could use a butterknife as a screwdriver. Those are a colossal pain in the rear end to deal with, especially the one below the GPU.

Even with the shortest ramsink it interferes with the placement of the heatsink. After taking the advice of this guide and trying to sand it down to no avail, I finally just used one of the regular ramsinks turned at a 90-degree angle to cover about 80% of the chip while the GPU heatsink covered the rest (yeah, that's what it's like dealing with this thing).

The thing just incidentally works on the 290/x. Granted, now that I have it running it does its job well (the highest I've hit was 58c on stock clocks). Knowing what I know now, I could probably cut the time by at least a third, but putting the thermal tape on each of the 24 ramsinks is not a process I'd ever wish to duplicate.

Tomorrow I'm going to try for the unlock and see just what I can manage to achieve on that end.

Yeah. I hear all kinds of stories about the VRAM below the GPU causing all sorts of troubles. Hell, in Tom's installation guide for the Xtreme III, they had to settle for one of those straight VRM sinks and half of a thermal pad.

Those core temps sound fantastic, but how are the VRM temps under load? How about noise level of the fans?

I'm still debating on getting a Gelid to replace the Xtreme III I had earlier. That, or I might get a cheap AIO and do "The Mod."

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Rahu X posted:

Yeah. I hear all kinds of stories about the VRAM below the GPU causing all sorts of troubles. Hell, in Tom's installation guide for the Xtreme III, they had to settle for one of those straight VRM sinks and half of a thermal pad.

Those core temps sound fantastic, but how are the VRM temps under load? How about noise level of the fans?

I'm still debating on getting a Gelid to replace the Xtreme III I had earlier. That, or I might get a cheap AIO and do "The Mod."

I'd tell you, but for some reason they don't show up in either GPU-Z or HWiNFO (as in I don't see anything under "VDDC Power In").

The fan isn't as quiet as I thought it would be, but it could just be a matter of giving it a little adjustment. It's not a huge deal since I keep the computer in another room and have the wires run through the wall, so the room I actually sit in is silent.

By the way, tomorrow I was going to put my copy of BF4 for sale on SA Mart. Would $40-$50 be a fair price?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Ghostpilot posted:

I'd tell you, but for some reason they don't show up in either GPU-Z or HWiNFO (as in I don't see anything under "VDDC Power In").

The fan isn't as quiet as I thought it would be, but it could just be a matter of giving it a little adjustment. It's not a huge deal since I keep the computer in another room and have the wires run through the wall, so the room I actually sit in is silent.

By the way, tomorrow I was going to put my copy of BF4 for sale on SA Mart. Would $40-$50 be a fair price?

I've seen the bundle games going for around $30-$40, depending... Question, does it have any singleplayer or is it strictly online competitive?

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Agreed posted:

I've seen the bundle games going for around $30-$40, depending... Question, does it have any singleplayer or is it strictly online competitive?

Oh I haven't a clue. I've never followed the series, so I don't know the first thing about it.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Generally in Battlefield games, there is a single player campaign, but it's kinda cruddy and completely lackluster. Not as sheer crap as Medal of Honor, but still cruddy. Multiplayer is the big deal.

It's a terrible game to buy if you wait for games to hit $20 or less, because by the time Battlefield X is that cheap, Battlefield X+1 is out and everybody is playing that.

Animal
Apr 8, 2003

I'll pay $40 if that includes Thief.

Schpyder
Jun 13, 2002

Attackle Grackle

Factory Factory posted:

Generally in Battlefield games, there is a single player campaign, but it's kinda cruddy and completely lackluster. Not as sheer crap as Medal of Honor, but still cruddy.

This is an entirely accurate description of the BF4 SP campaign.

Don't get it unless you want to play MP.

Freakazoid_
Jul 5, 2013


Buglord

Ghostpilot posted:

Agreed does have a point, I suppose we're all Veedubfreakazoids. :pcgaming:

So that's the guy who keeps sending me all these strange looking computer parts!

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Animal posted:

I'll pay $40 if that includes Thief.

It's just BF4; sorry about that.

Also seems that people may have stumbled upon an easy way of determining if cards will unlock.

Apparently version P.0 cards (core number ending in 2000) will unlock.



And version 1.1 cards (core number ending in 202) won't unlock.



Sadly I seem to fall into the latter category here. But at least I didn't have to try flashing to find out. It's not definitive, but so far the results have been consistent. There was a guy in the thread who mentioned one of his 290s unlocking and the other one not, so perhaps he'll be able to post the version numbers of his cards.
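
To spell out the pattern being reported as a quick sketch (purely illustrative - these are just the version strings and core-number suffixes posted so far, not a definitive test, and the example core number below is made up):

    # Rough sketch of the rumored unlock heuristic from the OCN thread.
    def likely_unlockable(version, core_number):
        core = str(core_number)
        if version == "P.0" and core.endswith("2000"):
            return True   # reported to unlock so far
        if version == "1.1" and core.endswith("202"):
            return False  # reported not to unlock
        return None       # unknown - flash at your own risk

    print(likely_unlockable("P.0", "xxxx2000"))  # True (hypothetical core number)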

How about you, Veedubfreak?

Edit: Also, Veedubfreak, people in the OCN unlock thread have been wondering if the unlocked shaders are perhaps a placebo effect and have mentioned that the best way to test would be versus a genuine 290x. Since you have (had?) both, might it be something you can verify?

Edit 2: Finally got a read on my VRM temps using the beta version of HWiNFO64. The minimum was 34c and the max was 57c.

Edit 3: Seems that the culprit of my Gelid's fan noise is a loose fan mount on the heatsink, which is something that shouldn't be difficult to correct.

Ghostpilot fucked around with this message at 16:11 on Nov 20, 2013

knox_harrington
Feb 18, 2011

Running no point.

I installed my 280X and it seems to be ridiculous overkill for 1080p; it runs Bioshock Infinite at about 150fps and Metro LL at 120. Big improvement over the 7870 XT anyway.

TyrantWD
Nov 6, 2010
Ignore my doomerism, I don't think better things are possible
I know I've complained about this before, but AMD not ensuring that the 290 and 290X non-reference models come out before the biggest shopping event of the year is such terrible management on their part (especially considering that the cards seem designed with alternative cooling options in mind).

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
I haven't had any issues in games or anywhere else, frankly, but when I browse SA this happens:



It'll go away when I scroll it off the screen or do anything that causes a refresh, but I can't think of why it happens. Any ideas?

An Unoriginal Name
Jul 11, 2011

My favorite touhou is my beloved Nitori.
:swoon:
Could be an add-on/plugin acting weird (if you have any installed); try disabling them all temporarily and see what happens. Or toggle hardware acceleration in your browser's settings.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
I've disabled the hardware acceleration and so far, so good. Thanks! :)

Also, I fixed the loose fan mount on the Gelid heatsink with a couple of zip ties. It cools wonderfully, but its idiosyncrasies remind me of my first car as a teenager. Hopefully this'll be the last of them.

veedubfreak
Apr 2, 2005

by Smythe
I already sent my 290x back to Amazon. I'll check my numbers when I get home to see if it's p0.

Both of my boards were sequential though, if that helps.

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

Ghostpilot posted:

Edit 2: Finally got a read on my VRM temps using the beta version of HWiNFO64. The minimum was 34c and the max was 57c.

Edit 3: Seems that the culprit of my Gelid's fan noise is a loose fan mount on the heatsink, which is something that shouldn't be difficult to correct.

Was this for both sets of VRMs, or just one? From what I've seen on various forums, people report that VRM 1 (believed to be the small group of VRMs near the IO plate) starts out cool but quickly spikes into the 90s upon running Furmark or even playing games.

It's also strange that GPU-Z isn't picking it up for you, because it picks up both of them for me. Are you sure you have the latest version?

Still, that's much better than I was expecting. Despite the installation process (which seems like hell on any aftermarket you get for the 290), would you recommend the Gelid?

Gunjin
Apr 27, 2004

Om nom nom

Sormus posted:

GPU Megathread - Live vicariously through someone else's purchases

Not gonna lie, that's what I'm doing. Thanks to kids and medical bills I'm stuck on a C2D E6550 (that won't overclock) with a HD5770. I haven't played a game at higher than minimum graphics in a couple years.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Gunjin posted:

Not gonna lie, that's what I'm doing. Thanks to kids and medical bills I'm stuck on a C2D E6550 (that won't overclock) with a HD5770. I haven't played a game at higher than minimum graphics in a couple years.

I'll pour 40min into Crysis 3 on my 650Ti tonight in your honor while waiting on the big card to get here. The bigger card. Biggest, for now, I guess. Gotta remember our brethren fellas.

Agrajag
Jan 21, 2006

gat dang thats hot

Agreed posted:

I'll pour 40min into Crysis 3 on my 650Ti tonight in your honor while waiting on the big card to get here. The bigger card. Biggest, for now, I guess. Gotta remember our brethren fellas.

Aren't the 800 series cards supposed to be rolling out relatively soon? Apparently the 800 series will actually have new tech/architecture, right?



VVV Haha, you bought a Xbone.

Agrajag fucked around with this message at 21:37 on Nov 20, 2013

mayodreams
Jul 4, 2003


Hello darkness,
my old friend
My last two cards were flagship launch purchases (5870 and 680), but you guys take it to a new level. :v: The fact that I paid more for my EVGA 680 than the 'overpriced' Xbone puts pc gaming into perspective sometimes.

Gunjin
Apr 27, 2004

Om nom nom

Agreed posted:

I'll pour 40min into Crysis 3 on my 650Ti tonight in your honor while waiting on the big card to get here. The bigger card. Biggest, for now, I guess. Gotta remember our brethren fellas.

I forgot to mention the best part of it, I have it crammed into one of these:

http://www.silentpcreview.com/article134-page1.html

Just look at that amazing box art. That dates back to my overclocked 4200Ti. Now that was gaming power.

Gunjin fucked around with this message at 21:47 on Nov 20, 2013

veedubfreak
Apr 2, 2005

by Smythe

Agrajag posted:

Aren't the 800 series cards supposed to be rolling out relatively soon? Apparently the 800 series will actually have new tech/architecture, right?



VVV Haha, you bought a Xbone.

The next gen cards will be 20nm and TSMC is having issues with it due to *things* last I heard. Word is probably 4th quarter of 2014. I'm betting I'll get a solid year out of my 290s if not more.

Heh, Xbone.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Rahu X posted:

Was this for both sets of VRMs, or just one? From what I've seen on various forums, people report that VRM 1 (believed to be the small group of VRMs near the IO plate) starts out cool but quickly spikes into the 90s upon running Furmark or even playing games.

It's also strange that GPU-Z isn't picking it up for you, because it picks up both of them for me. Are you sure you have the latest version?

Still, that's much better than I was expecting. Despite the installation process (which seems like hell on any aftermarket you get for the 290), would you recommend the Gelid?

That's hard to say without knowing what's involved with the others. The only other one I've seen anyone attempt (on air) was the Arctic Accelero Xtreme III, which seems to share a lot of the Gelid's issues with the ramsinks, in addition to requiring parts above and beyond what comes with the cooler. At that point you're nearing the cost of two Gelids.

Something that rubs me the wrong way with the Gelid vs. the Arctic is that the Gelid officially says it supports the 290/x. Nothing that is officially supported should require that much jury-rigging and a forum guide to get running.

All that said, if you can make it past the horrendous installation, it does the job well and it's certainly the cheapest of your options. If you have to get an aftermarket cooler right now and are okay with spending several hours on it, then it'll serve you well.

If you can live with the stock AMD until proper 290/x solutions arrive, then I'd go that route.

As for the VRMs, I'll take a screenshot once I get back.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Agreed posted:

I'll pour 40min into Crysis 3 on my 650Ti tonight in your honor while waiting on the big card to get here. The bigger card. Biggest, for now, I guess. Gotta remember our brethren fellas.

I mentioned this in an email but War Thunder on Ivy Bridge APU baby :pcgaming:

Truly, we are now in the era of powerful integrated graphics. At least, when there are developers that care to optimize for a larger number of players. Looking at you, :pgi:.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Agrajag posted:

Aren't the 800 series cards supposed to be rolling out relatively soon? Apparently the 800 series will actually have new tech/architecture, right?



VVV Haha, you bought a Xbone.

GbtYCS (oh, wait...)

:smaug:

As far as the release timing goes, last I heard the rumor mill (so take it with a grain of salt) has them dropping 28nm production Maxwell cards some time in late Q1/early Q2 2014, with the node shrink happening in late 2014. But nVidia themselves haven't been especially noisy about Maxwell, probably because advertising production difficulties with TSMC isn't going to help anybody's bottom line and they already have a product that counters AMD's next-gen cards within the Kepler architecture. Maxwell is going to have some pretty damned cool stuff, yeah, but there are a lot of uncertainties about exactly how said cool stuff will be utilized. The one after Maxwell looks pretty boss, integrating stacked DRAM on-die for absurd memory bandwidth, but just like the Xbox One, or for that matter just practical computing in general, I reckon there will still be substantial bottlenecks around the really fast bit that make the really fast bit slow down and wait. Not sure what AMD is up to after GCN 1.1; they have made motions in the direction of it being an architecture intended to last a long time, I'd speculate because they'll want to do what they can to leverage their win in the consoles as much as possible and get while the getting's good.

All that said, unless something REALLY dramatic happens with Maxwell's release, it seems very unlikely that there will be something that offers a significant performance improvement over fully enabled GK110 - I'm all ears (:allears:?) for them to prove me wrong, but I'm thinking the 780Ti ought to remain nVidia's most powerful single-GPU gaming card until the 20nm shrink of Maxwell, provided that the rumor mill isn't just stuck in bullshit mode. I would expect further efficiency improvements with Maxwell, though; they've really doubled down on that, and you can see how well it's done for them in the progression from Tesla to Fermi to Kepler in terms of perf/watt. AMD seems weirdly satisfied with less efficient but just brute-force powerful cards, and as a result they've got more transistors that all run hotter in a smaller total area, and boy howdy does it run hot.

I don't understand that particular choice, to be honest. This is sub-101 level stuff: resistance increases as temperature increases, which results in an increase in voltage since V=IR. So by choosing to design them for a 95ºC thermal target, they know they're making less efficient cards. Every other ASIC on the planet is moving in the direction of efficiency at the speed of necessity; I can't see HPC shops looking favorably on cards that have relatively poor performance per watt when power is, generally, the biggest expense involved in supercomputing, from both the "it takes this much power to run them" and the "it takes this much power to cool them" directions.
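
Back-of-envelope, taking the simple V=IR framing at face value (the temperature coefficient below is a hypothetical, copper-like value and 25ºC is an assumed reference point, so treat this as a sketch rather than real silicon numbers):

    # Rough sketch: extra resistive loss at a 95C target vs an 80C target,
    # at the same current, assuming a linear temperature coefficient.
    alpha = 0.004          # hypothetical coefficient, per degree C (copper-like)
    t_ref = 25.0           # assumed reference temperature, C
    def rel_resistance(t):
        return 1 + alpha * (t - t_ref)
    # P = I^2 * R, so at fixed current the loss scales directly with R
    ratio = rel_resistance(95) / rel_resistance(80)
    print(f"~{(ratio - 1) * 100:.0f}% more resistive loss at 95C vs 80C")  # ~5%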

AMD hasn't ever been a top pick for HPC; nVidia's developer relations dwarf theirs, and if they're going to face serious competition in that space it'll be from Intel (or a shift toward even more highly parallel, lower power designs, but Intel and nVidia both seem ready to answer that as development moves forward). Even so, AMD is making an odd choice here, in the long run. Maybe they'll pull a Fermi and fix the power problem with a refresh generation. The GTX 400s ran hot as hell, and nVidia did a lot to fix that with the release of the 500s (wasn't all about enabling one more hot-clocked SM, after all :v:). AMD ... probably ought to do that, if they can. And they don't need to wait for 20nm to do it, either... If they can.

Edit: It's kind of double-baffling since nVidia is doing HSA with Maxwell, something AMD has been working on practically for some time, though I share Professor Science's :raise: at the idea of moving workloads over PCI-e at that level of performance coherently. But nVidia doing with Maxwell what AMD did some time back does at least validate the notion as meritorious and not just AMD tilting at windmills. I'm quite curious about the "how" all the same. :)

Agreed fucked around with this message at 22:56 on Nov 20, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Agreed posted:

I don't understand that particular choice, to be honest. This is sub-101 level stuff: resistance increases as temperature increases, which results in an increase in voltage since V=IR. So by choosing to design them for a 95ºC thermal target, they know they're making less efficient cards.



Heat transfer = -1 * thermal conductivity * temperature difference. You can move more heat between 95ºC and room temperature than you could between 80ºC and room temp. It's how they "got away" with keeping the same reference cooler (i.e. without using a cooler with better conductivity).
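
To put rough numbers on that (assuming ~25ºC room temperature, which is an assumption on my part, and treating the cooler as a fixed thermal resistance):

    # Newton's-cooling-style sketch: for the same cooler and airflow, heat moved
    # scales with the temperature difference between the die and the room.
    t_room = 25.0                      # assumed ambient, C
    def relative_heat_moved(t_die):
        return t_die - t_room          # proportional to delta-T
    gain = relative_heat_moved(95) / relative_heat_moved(80)
    print(f"~{(gain - 1) * 100:.0f}% more heat moved at 95C than at 80C")  # ~27%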

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

veedubfreak posted:

The next gen cards will be 20nm and TSMC is having issues with it due to *things* last I heard. Word is probably 4th quarter of 2014. I'm betting I'll get a solid year out of my 290s if not more.

Heh, Xbone.

My wording was poor in that I was just trying to make the point that these cards cost more than the new consoles. I refrain from buying consoles at launch, and when I do pick one up, I'm going for a PS4.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:



Heat transfer = -1 * thermal conductivity * temperature difference. You can move more heat between 95ºC and room temperature than you could between 80ºC and room temp. It's how they "got away" with keeping the same reference cooler (i.e. without using a cooler with better conductivity).

I think you misread me - I understand the equations involved plenty well; my question is not "how did they even do this?" but rather "why did they choose to engineer the Volcanic Islands architecture in this fashion even though it is going in the exact opposite direction of not only nVidia, but also literally everyone else, including their own CPU division?"

The answer might be as simple as they're embracing gamers because nobody else buys AMD cards. But I had the impression they wanted to make inroads into HPC - all the big talk about their firsts in terms of releasing X teraflops on a GPU, not choking the performance of the top-end cards compared to nVidia's very different strategy there - but that's not going to happen with the relatively much poorer performance per watt compared to Kepler, let alone the efficiency improvements that nVidia has claimed for Maxwell. Well, not past the level at which some individual OpenCL developer might feasibly buy an R9 290 or 290x instead of a Titan, I guess, but that's a tiny market.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Gunjin posted:

C2D E6550 (that won't overclock)

That's a goddamn shame! I could run mine at 3.4GHz stable all day long! (Now it just languishes in a server, running at stock, that I only boot up now and then to run backups of my stuff).

Josh Lyman
May 24, 2009


My 7970 GHz Edition just came in. If I want to compare numbers between it and the 560 Ti it's replacing, is 3DMark still the benchmark of choice?

Josh Lyman fucked around with this message at 00:47 on Nov 21, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Yeah, 3DMark seems to own the public's mind as the epeen-ometer. For reference, I was pushing 10K in Firestrike with the 780 GTX that is now happily on its way to Sidesaddle, but my poor 650Ti, usually only used as a PhysX card, is all WHAT THE HELL ARE YOU DOING TO MEEE and squeezes out around 3350. Also, at the overclock I'm running it at, the cooler can't quite keep up with really demanding high-end features; it'll get right at 82ºC and start throttling. It never did that when all it had to deal with was 17%-24% GPU usage PhysX workloads, hah. I can see why the 650 Ti Boost has the full two-slot shroud cooler thing going on; this lil fella only has a 90mm fan over the GPU and VRAM from what I can see.

Note: Alereon has explicitly stated that we shouldn't devote time to posting 3DMark scores; I hope I'm not breaking that rule by noting the difference between a $120 card and a $670 card in praxis in the same machine. I've launched my career as a professional GPU reviewer; now to contact AMD and see about getting a press sample...

Agreed fucked around with this message at 00:35 on Nov 21, 2013

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Rahu X posted:

Was this for both sets of VRMs, or just one? From what I've seen on various forums, people report that VRM 1 (believed to be the small group of VRMs near the IO plate) starts out cool but quickly spikes into the 90s upon running Furmark or even playing games.

It's also strange that GPU-Z isn't picking it up for you, because it picks up both of them for me. Are you sure you have the latest version?

Still, that's much better than I was expecting. Despite the installation process (which seems like hell on any aftermarket you get for the 290), would you recommend the Gelid?

Here are the VRM temps after a couple of runs of Shadertoymark at 2560x1440 and some time on Guild Wars 2 at an ambient temperature of 75f.

PC LOAD LETTER
May 23, 2005
WTF?!

Agreed posted:

"why did they choose to engineer the Volcanic Islands architecture in this fashion even though it is going in the exact opposite direction of not only nVidia, but also literally everyone else, including their own CPU division?"...but that's not going to happen with the relatively much poorer performance per watt compared to Kepler

Probably because it's only ~50W more than the 780GTX, the people/market (i.e. gamers; these aren't FireStream Hawaii cards) interested in these cards care more about performance bang vs. buck, and TSMC/GF/<insert non-Intel foundry of choice> don't have much in the way of process improvements to offer now or anytime soon. Slap a better cooler on a 290 or 290X and it drops to 20-30W more power vs. a 780GTX. The power issue is overblown and mostly FUD at this point. If they did their job properly on designing the card, even the temp issue is overblown and FUD. Noise can definitely be a problem for some, though. They definitely should've done a better job on the reference cooler any way you look at it.

The rumor mill is still saying mid-to-late 2014 for TSMC's 20nm chips to roll off the line, and that the improvements over their 28nm process won't be all that impressive, at least initially. They've improved their 28nm process over time, and I'm sure they'll do the same with their 20nm tech. Thing is, I don't see them doing much of it in a timely manner before their next process is supposed to be ready... unless they're going to delay that too.

That would make a 28nm Maxwell reasonable to do for nvidia, but it'd probably also be a relatively hot and power hungry chip vs Kepler on that process.

The HPC Hawaii cards are going to have ungimped double-precision (DP) compute performance, which is something GCN is pretty good at. The performance/watt probably won't be "poor" at all vs. Kepler for those workloads. Power usage hasn't been the issue with AMD getting the HPC guys to use their hardware anyway; it's software and developer support.

Haeleus
Jun 30, 2009

He made one fatal slip when he tried to match the ranger with the big iron on his hip.
So I want to splurge on a new GPU, but the 780Ti is a bit too expensive; I'm willing to buy something in the $500-600 range. I'd like to know how a 780 Superclocked ACX would compare in performance to a 290x.

Also, I haven't used an ATI card since the Radeon 9600 back in the day. Besides the heat issue (which is why, if I do get a 290x, I'll wait for a non-reference cooler design), is there any caveat to going AMD over Nvidia for 1080p gaming, like inconsistent performance, driver issues, etc.?

woppy71
Sep 10, 2013

by Ralp
Guys, I have £65 to spend on a new graphics card for my PC (I'm a gamer on a low budget) to replace the GT430 that I currently have installed.

My monitor has a maximum resolution of 1360 x 768 and to be honest, I'm not too worried about playing games at "ultra" settings and I'm more than willing to dial down the settings in order to get playable frame rates.

My current system has an Intel Core 2 Duo processor (not sure of the exact model, but it's the 2500MHz version) and 6GB of RAM.

I've read a few reviews on budget graphics cards and one particular model seems to get mentioned quite a lot, namely the Radeon HD7750.

Do you guys think this would be a reasonable upgrade, bearing in mind my expectations, or are there better cards out there for around £65?

Your input would be greatly appreciated :)

GrizzlyCow
May 30, 2011
An HD7750 should be around twice as powerful as your current card according to AnandTech, and it should be perfectly capable of giving you around 60FPS at such a low resolution, possibly even on those Ultra settings you don't really care about. The 7750 is about the baseline gaming card, and it is compatible with just about all computers nowadays. The closest comparable card is the GTX 650, but I don't know the pricing on that card.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I assume this is the best place for this question...

If I overclock an i7, do the integrated graphics become overclocked as well? Or is it possible to choose to overclock only that section of the chip?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Haeleus posted:

So I want to splurge on a new GPU, but the 780Ti is a bit too expensive; I'm willing to buy something in the $500-600 range. I'd like to know how a 780 Superclocked ACX would compare in performance to a 290x.

Also, I haven't used an ATI card since the Radeon 9600 back in the day. Besides the heat issue (which is why, if I do get a 290x, I'll wait for a non-reference cooler design), is there any caveat to going AMD over Nvidia for 1080p gaming, like inconsistent performance, driver issues, etc.?

Both companies, at various times, have driver issues. Apart from that, the R9 290 is a better buy because it has almost no ground between it and the R9 290X - you're not paying for ~25% more shaders and texture units (combined figure - individually it's about 20% and about 7%, respectively) as with the 780 vs. 780Ti comparison. And you're not getting a big step up in VRAM clock speed either, from the 780's 1500MHz GDDR5 to the 780Ti's 1750MHz GDDR5 (6GHz vs. 7GHz effectively, which with the 384-bit memory bus works out to 288.4GB/sec on the 780 vs. 336GB/sec on the 780Ti). While the price premium on the 780Ti outpaces the performance gains, it does at least buy you some significant differences in shader and texel throughput as well as a nice boost in memory bandwidth.
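
(For anyone who wants to check the memory bandwidth arithmetic, it's just the effective memory clock times the bus width in bytes; quick sketch, with the clocks rounded to the even 6/7GHz figures:)

    # Bandwidth (GB/s) = effective transfer rate (GT/s) * bus width (bytes)
    def bandwidth_gb_s(effective_clock_ghz, bus_width_bits):
        return effective_clock_ghz * (bus_width_bits / 8)
    print(bandwidth_gb_s(6.0, 384))  # GTX 780:   288.0 GB/s (288.4 with the exact clock)
    print(bandwidth_gb_s(7.0, 384))  # GTX 780Ti: 336.0 GB/s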

Not so with the R9 290 vs. the R9 290X. Instead, you're paying for 10% more stream processors and 10% more texture units. Both 290 and 290x have 64 ROPs, 4GB of VRAM running at the same clock frequency, and the reference core clock difference for Powertune's boost clock is 947MHz on the R9 290 versus 1000MHz on the R9 290x. They are very, very nearly the same card, separated by $100 and a theoretical maximum performance difference of less than 10%, real-world even lower than that in most cases.

The R9 290 compares favorably in performance and value to the GTX 780, though with reference cooling the GTX 780 takes off ahead of both the R9 290 and the R9 290X when overclocking is taken into consideration. It's probably worth waiting for aftermarket coolers that are engineered from the ground up to take care of the higher transistor density and heat output of the Hawaiian Islands GPU and supporting circuitry to close the overclocking gap, as people who have gone with DIY aftermarket cooling have found them to be very capable overclockers, which brings them neck and neck again with the 780, Titan, and 780Ti.

It's probably the case that the best clocked 780Ti cards will outperform the best clocked R9 290X cards in games. It's also probably the case that the best clocked R9 290 cards will outperform the best clocked GTX 780s in games. This, of course, doesn't take into account the deviance expected with games that just run better on one company's architecture than the other, and in any case it's hard to say a lot in that direction firmly right now as robust overclocking for the R9 290 and 290x hasn't really happened yet.

Right now, the fight is more for value-add/opportunity cost, and features. Do you want quieter cards with much better reference cooling performance allowing for stock setups to overclock quite well? Features like Shadowplay, global FXAA without any hassle, TXAA in games that support it (not many), Adaptive V-Sync, PhysX (it can sorta be hacked in with AMD cards, but now games are locking it out explicitly, and really it's just some extra graphical flair, very few games have it and fewer are HOLY poo poo better with it), the games/Shield bundle, and better performance in games that are optimized more for nVidia hardware?

Or do you want noisier, less effectively cooled cards, but with the upcoming possibility of Mantle being a solid force multiplier for AMD hardware in games that support it, TrueAudio offering GPU-accelerated positional audio in games that support it, a different and somewhat less competitive but still kinda neat games bundle, and better-priced cards for the level of performance they offer (assuming that their fan profiles are aggressive enough to keep them clocked close to their max and you don't have to go all MacGyver on them just to control temps to prevent throttling)?

Agreed fucked around with this message at 06:43 on Nov 21, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Zero VGS posted:

I assume this is the best place for this question...

If I overclock an i7, do the integrated graphics become overclocked as well? Or is it possible to choose to overclock only that section of the chip?

The IA (CPU) cores and the IGP are overclocked (and volted) entirely separately. Notably, you can overclock the IGP on any chipset, not just Z- (or P-) series ones.
