Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

EVGA dampens the fan noise somewhat, it's not a 100% reference blower iirc, but it is easily a lot quieter in games than the 580. In fact, in games where the 580 does CUDA PhysX processing, I can't hear the 680's overclocked-as-far-as-it'll-go fan (even with the custom fan profile that begins aggressively cooling at 60ºC to prevent throttling) over the GTX 580's fan, and the 580 has a much less demanding overall workload and is at stock clocks and voltage. It just has a noisier fan design.

That said, obviously the farther you get from the reference design and the more and bigger fans you add, the quieter it'll be. But I've got three 200mm fans that are pretty much silent (meaning I've also got at least three 200mm x 200mm holes in my case), and the 680 is inoffensively noisy even when going full blast.

It also cools REALLY well: even when it's hitting a power target >110% and getting a thorough workout, it doesn't get over about 63-64ºC with the custom fan profile. (Independently verified that it throttles by 13MHz at 70ºC, and by another 13MHz at 80ºC, but even in Metro 2033, the only game I'm running that will actually get the GPU fully engaged at max settings, all it took was a custom fan profile that basically pegs fan speed % to temperature up to 55ºC, then goes to max between 55ºC and 60ºC to keep it from getting hot and throttling.)
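If it helps to picture that profile, the shape is roughly this (an illustrative sketch, not an export of my actual fan curve):

```python
def fan_percent(temp_c):
    """Custom profile shape: track temperature 1:1 up to 55C, then
    ramp hard to 100% by 60C to stay well clear of the 70C throttle
    point. Numbers are illustrative."""
    if temp_c <= 55:
        return temp_c  # fan % pegged to temperature
    if temp_c < 60:
        return 55 + (temp_c - 55) * (100 - 55) / 5  # steep ramp
    return 100

for t in (40, 55, 58, 63):
    print("%d C -> %d%% fan" % (t, fan_percent(t)))
```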

It's a cool running card and anything based on the reference vapor chamber is going to cool well. More sophisticated/involved solutions can do the job quieter but I sincerely doubt better, barring BIOS modding and suicide-run super overclocks for 3Dmark e-peen.


incoherent
Apr 24, 2004

0101010001101000011100100110100101101100011011000110010101110010

Factory Factory posted:

Update to this:

First, it wasn't the US Government; technically it was a security group at my alma mater.

Second, AMD sulked a bit that it only affected users who screwed around with security settings, but will fix the bug.

Fairly sure someone higher up the food chain said "fix it." While it doesn't affect the greater AMD, they don't want anything out there saying "AMD (the organization) is insecure," even if it's just a part of the video card drivers.

Much ado over little, though, when people are actively defeating cryptographic hashes in Windows Update.

Baby Proof
May 16, 2009

Any thoughts on 7770 Crossfire now that single cards are finally dipping down below the $100 mark? I probably won't bite because there's always at least one game that won't work right in Crossfire/SLI, but the benchmarks look good compared to $200 cards and the power use is surprisingly low...

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Speaking as one who went with 6850s in CF, a single card is always preferable. There are just too many headaches and caveats and such associated with CF and SLI setups. The raw power is impressive, but you pay for it in odd performance problems, more rampant instability, constant driver/application profile updates, and additional noise.

Tunga
May 7, 2004

Grimey Drawer
I ran CF 6870s for a year. It has zero effect in some games, doesn't work at all when not in full screen mode, gives annoying microstutter, and sounds like there is a motorbike in your case. Don't do it.

I finally replaced mine with a 670 and could not be happier with the performance.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Baby Proof posted:

Any thoughts on 7770 Crossfire now that single cards are finally dipping down below the $100 mark? I probably won't bite because there's always at least one game that won't work right in Crossfire/SLI, but the benchmarks look good compared to $200 cards and the power use is surprisingly low...

Less than $100? Cheapest I can see on Newegg is $119.99 after rebate. For exactly double that, you could get a 7850 instead. Then you've got double the VRAM and no Crossfire issues. I personally wouldn't invest in a new gaming setup with 1GB of VRAM, especially not for $240, even if the benchmarks often put 7770 CF ahead.

HalloKitty fucked around with this message at 09:29 on Jun 12, 2012

The Gunslinger
Jul 24, 2004

Do not forget the face of your father.
Fun Shoe

Fortuitous Bumble posted:

What are people's experiences with the fan noise from the GTX 670 blower setup? My current Radeon 5770 makes this annoying whining/grinding sort of noise even at idle; I don't know if they all do that, but I'd like to avoid it. I was originally looking at the Asus version to avoid the noise issue, but it seems to be perpetually out of stock, and I saw this EVGA version with the 680 PCB for a bit cheaper, but as far as I can tell it has the reference cooler. Or it might be the 680 cooler, if there's any difference.

The reference blower on the 670 isn't exactly their best work; it's fairly audible. I don't hear mine with my headphones on or anything, but when I went to answer the phone the other night while running Skyrim it was putting out a fair bit of noise. It seems slightly louder than my 560 Ti was, but it's nothing obnoxious. No idea about that specific card though.

Tunga
May 7, 2004

Grimey Drawer
The ASUS DCII cooler on my 670 (I have the TOP variant but it's the same cooler) is ridiculously quiet; it's actually quieter than my case fans.

Boten Anna
Feb 22, 2010

Is there any hope in the nearish future for SLI without all the wacky restrictions? Like, when my 670 is starting to chug, is there any chance I'll be able to smack in another one and keep at it without all this annoying "full screen only" malarkey and poo poo drivers, maybe thanks to things like the 690, or is it kind of hosed for the foreseeable future?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Boten Anna posted:

Is there any hope in the nearish future for SLI without all the wacky restrictions? Like, when my 670 is starting to chug, is there any chance I'll be able to smack in another one and keep at it without all this annoying "full screen only" malarkey and poo poo drivers, maybe thanks to things like the 690, or is it kind of hosed for the foreseeable future?

It's the way SLI works, I'm afraid.

The Gunslinger posted:

The reference blower on the 670 isn't exactly their best work; it's fairly audible. I don't hear mine with my headphones on or anything, but when I went to answer the phone the other night while running Skyrim it was putting out a fair bit of noise. It seems slightly louder than my 560 Ti was, but it's nothing obnoxious. No idea about that specific card though.

Agreed was saying the one on his EVGA 680 is pretty quiet; I would assume the 670 is pretty much the same. All the reviews say the 6xx blowers are considerably quieter than the 5xx ones. Maybe yours is just loud?

The Gunslinger
Jul 24, 2004

Do not forget the face of your father.
Fun Shoe

Dogen posted:

It's the way SLI works, I'm afraid.


Agreed was saying the one on his EVGA 680 is pretty quiet; I would assume the 670 is pretty much the same. All the reviews say the 6xx blowers are considerably quieter than the 5xx ones. Maybe yours is just loud?

I'm not a sound guru or anything; it's not necessarily louder, but it's more noticeable, if that makes any sense. TechReport kind of notes what I'm talking about. They say it's more noticeable at idle, but I've found it more apparent under load. It's not a big deal or anything, but it's definitely not the quietest card I've owned. That said, it's a noise/power/cost tradeoff, and I don't think it's so bad that people should force themselves to get a more expensive third-party cooler design or something.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The cooler on my EVGA may not be totally reference. It's got sound-dampening stuff on the fan itself. Is that reference quality? I dunno. Definitely shitloads quieter than the 580 sitting below it, whether that card is idling or running full-bore for PhysX crap.

Boten Anna
Feb 22, 2010

Dogen posted:

It's the way SLI works, I'm afraid.

Does it work basically by having each card draw different elements on the screen and then merging them together, using hardware or even just the raw video signal somehow?

Is it possible in the future that the connection between the two will be more of a... for lack of a better word, logical link that basically just uses additional cores to throw more hardware at the rendering, similar to a multi-core CPU?

I'm probably phrasing this in all kinds of terrible ways, what with having an only rudimentary understanding of how any of this works under the hood.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Boten Anna posted:

Does it work basically by having each card draw different elements on the screen and then merging them together, using hardware or even just the raw video signal somehow?

Is it possible in the future that the connection between the two will be more of a... for lack of a better word, logical link that basically just uses additional cores to throw more hardware at the rendering, similar to a multi-core CPU?

I'm probably phrasing this in all kinds of terrible ways, what with having an only rudimentary understanding of how any of this works under the hood.

1) Yes, basically. I'm not sure if my memory still holds up on this, but assuming it's the same tech as it was back when I had 2 Voodoo 2s, basically one card draws half the lines and the other draws the other half: each card draws every other line, hence "scan-line interleave" (see the little sketch below).

2) I think this is how crossfire works? But not how SLI works. Someone else can probably do a better job.
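In code terms, the old scheme was basically this (a toy illustration, obviously not how the hardware actually did it):

```python
# Toy picture of Voodoo2-style Scan-Line Interleave:
# each card owns every other scan line of the frame.
def sli_owner(scan_line):
    return scan_line % 2  # card 0 gets even lines, card 1 gets odd

print([sli_owner(y) for y in range(8)])  # [0, 1, 0, 1, 0, 1, 0, 1]
```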

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Boten Anna posted:

Does it work basically by having each card draw different elements on the screen and then merging them together, using hardware or even just the raw video signal somehow?

Is it possible in the future that the connection between the two will be more of a... for lack of a better word, logical link that basically just uses additional cores to throw more hardware at the rendering, similar to a multi-core CPU?

I'm probably phrasing this in all kinds of terrible ways, what with having an only rudimentary understanding of how any of this works under the hood.

Tom's Hardware actually has a decent explanation of this stuff, including some explanation of microstutter, SLI/Crossfire's not-so-pleasant side.

http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995.html

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
That really didn't explain it at all!

The Wikipedia article on SLI is pretty good, though. Apparently it's totally different now and works in a different way: the abbreviation is the same, but what it stands for is different. Oh, well.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I was gonna write this whole thing up, but then I Googled and found Hardware Secrets had already done the work. Clicky the linky for an exciting delve into the meanings of such letter salad as AFR, SFR, AFR of SFR, SLI AA, Scissors, Supertiling, and Super AA.

It's vintage 2008, though, when CF and SLI were The New Big Thing rather than boring commonplace e-peen inflators.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Boten Anna posted:

Does it work basically by having each card draw different elements on the screen and then merging them together, using hardware or even just the raw video signal somehow?

Is it possible in the future that the connection between the two will be more of a... for lack of a better word, logical link that basically just uses additional cores to throw more hardware at the rendering, similar to a multi-core CPU?

I'm probably phrasing this in all kinds of terrible ways, what with having an only rudimentary understanding of how any of this works under the hood.

To your first question: yes. There are several different rendering modes, but the end goal is to split work between the GPUs at a high level and then recombine the images. They can switch off alternate frames, or split the screen (horizontally, vertically, or in a checkerboard pattern) and each contribute to the same frame.
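As a toy sketch of those splits (illustrative only, nothing like real driver code; the names are mine):

```python
# Toy model of multi-GPU work splitting (illustrative only).

def afr_owner(frame_number, gpus=2):
    """Alternate Frame Rendering: whole frames round-robin."""
    return frame_number % gpus

def sfr_owner(row, frame_height=1080, gpus=2):
    """Split Frame Rendering: horizontal bands of one frame."""
    return min(row * gpus // frame_height, gpus - 1)

def tile_owner(x, y, tile=32, gpus=2):
    """Checkerboard split: small tiles dealt out between the GPUs."""
    return (x // tile + y // tile) % gpus

print(afr_owner(7), sfr_owner(900), tile_owner(100, 40))  # 1 1 0
```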

It's certainly possible on a theoretical level to couple GPUs together closely. After all, a modern GPU is basically a big pile of dumb processors. The problem is bandwidth. On the same silicon die, coupling all those processors together and to the support hardware they need (memory controllers, rasterizers, etc) is difficult but certainly possible. On the same circuit board, it's exceptionally difficult. AMD tried to implement something like this a while back, in order to improve the performance of "SLI on a stick" dual-GPU cards, and dedicated a bunch of die space to it - but nothing using it ever came to market, and the feature was dropped in the next generation. Across a PCIe link and SLI/Crossfire bridge, it's a "you're loving kidding, right?" problem - there's just not enough bandwidth. And, even though very smart people are continually working to develop higher-bandwidth interconnects, bandwidth requirements for GPUs keep going up, too. It's simpler, cheaper, and still works fairly well to just duplicate memory contents and ask each GPU to contribute a chunk of a fully rendered frame.
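To put rough numbers on the bandwidth problem (back-of-envelope, round vintage-2012 figures):

```python
# Back-of-envelope: why tight GPU coupling across PCIe is hopeless.
pcie2_x16 = 8.0       # GB/s each way, PCIe 2.0 x16
gddr5_256bit = 192.0  # GB/s local VRAM bandwidth (256-bit bus @ 6 Gbps)

# Shipping finished 1080p frames between cards is cheap...
frame_gb = 1920 * 1080 * 4 / 1e9  # one 32-bit-color frame, ~0.008 GB
print("60 fps of final frames: %.2f GB/s" % (frame_gb * 60))  # ~0.50

# ...but truly sharing working memory would need VRAM-class bandwidth.
print("PCIe shortfall: ~%dx" % (gddr5_256bit / pcie2_x16))  # ~24x
```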

Dogen posted:

That really didn't explain it at all!

The wikipedia on SLI is pretty good, though. Apparently it's totally different now and works in a different way. The abbreviation is the same, but what it stands for is different. Oh, well.

The theory behind how it works is actually pretty close: each card renders a lower-resolution image that contributes to the whole. It's just not broken up by scan lines any more, because modern video cards don't have the same passthrough access to video output (and convenient sync signals) that the Voodoo2s did.

The name is Nvidia just blatantly cashing in on an old nostalgic trademark, though.

KillHour
Oct 28, 2007


That Wikipedia page is pretty out of date...

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Goon project: Update that Wiki page. I'll create a Wiki for it.

GRINDCORE MEGGIDO
Feb 28, 1985


ijyt posted:

Overclockers UK on the other hand, are a steaming pile of poo poo.

I had real problems with them.

What's the current state of switchable graphics on a desktop?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

wipeout posted:

I had real problems with them.

What's the current state of switchable graphics on a desktop?

Non-existent. The closest thing is Lucid Virtu/MVP, which can have some similar benefits but is pretty much completely different in how it works.

Sagebrush
Feb 26, 2012

In terms of general gaming performance, what's a rough desktop equivalent to a GTX660M/2GB? How about a 640M? A Quadro K2000M?

Work is buying me a new laptop, and those are the cards in some of the options I'm considering...wondering how they'd compare to the 9800GTX/512 in my desktop right now. (Yeah, I need to upgrade).

Mr.Fuzzywig
Dec 13, 2006
I play too much Supcom
So I'm building a new computer, and because I'm fairly insistent that everything run at max settings, I was going to get a 680, but I hear now that the 670s are almost the exact same card. Is the performance boost big enough to justify another $100 or so?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Sagebrush posted:

In terms of general gaming performance, what's a rough desktop equivalent to a GTX660M/2GB? How about a 640M? A Quadro K2000M?

Work is buying me a new laptop, and those are the cards in some of the options I'm considering...wondering how they'd compare to the 9800GTX/512 in my desktop right now. (Yeah, I need to upgrade).
There are no desktop equivalents for any of those cards because they haven't launched yet. It's hard to draw comparisons to older desktop cards because they are a completely different architecture, but I would say that the GTX 660M would be much faster, the GTX 640M has a faster GPU but less than half the memory bandwidth (meaning it will be faster at low-res/light duty but performance falls off a cliff), and I can't find anything about what a Quadro K2000M is.

Mr.Fuzzywig posted:

So I'm building a new computer, and because I'm fairly insistent that everything run at max settings, I was going to get a 680, but I hear now that the 670s are almost the exact same card. Is the performance boost big enough to justify another $100 or so?
Just get the 670, performance is within a few percent, especially overclocked. It should be absolutely zero effort to raise the power cap to 122% and add the +91MHz clock offset to boost to the same clocks as the GTX 680, and the extra shaders don't make a big difference. I'm really happy with my choice of the GTX 670.
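For reference, the arithmetic behind that +91MHz number (reference base clocks quoted from memory, so double-check me):

```python
# +91MHz takes the GTX 670's base clock right to the GTX 680's.
gtx670_base = 915   # MHz, reference spec (from memory, double-check)
gtx680_base = 1006  # MHz, reference spec (from memory, double-check)
print(gtx670_base + 91 == gtx680_base)  # True
```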

Alereon fucked around with this message at 02:23 on Jun 13, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
660M - About a GeForce GT 640. ~9% slower core clock but ~12% faster RAM clock, no idea how that plays out in benchmarks.
640M - Half to 2/3 a 660M when memory bandwidth limits are involved, I'd say around a GeForce 440/630.
K2000M - Slightly slower 650M. Draw your own parallels.

Mr.Fuzzywig
Dec 13, 2006
I play too much Supcom

Alereon posted:

Just get the 670, performance is within a few percent, especially overclocked. It should be absolutely zero effort to raise the power cap to 122% and add the +91MHz clock offset to boost to the same clocks as the GTX 680, and the extra shaders don't make a big difference. I'm really happy with my choice of the GTX 670.

Thank you very much, does this http://www.newegg.com/Product/Product.aspx?Item=N82E16814130787 look like an acceptable card? I'll admit I just picked the first result off Newegg, but this one seems to have higher clock speeds?

Somebody fucked around with this message at 03:50 on Jun 13, 2012

Chuu
Sep 11, 2004

Grimey Drawer
I'm considering upgrading a computer with a 1680x1050 monitor to a 2560x1440 monitor. It's currently a Core i5-2500 w/ a Radeon 4850, and I'm happy with the graphics as is.

How much am I realistically going to have to spend to keep a similar level of quality if I want to game at native resolutions?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Mr.Fuzzywig posted:

Thank you very much, does this http://www.newegg.com/Product/Product.aspx?Item=N82E16814130787 look like an acceptable card? I'll admit I just picked the first result off Newegg, but this one seems to have higher clock speeds?
Remember that time I accidentally edited my reply into your post? Edit != quote is apparently pretty easy for mods :shobon: Anyway, I would recommend against factory-overclocked cards due to the difficulty of testing overclocks for stability on the GTX 600 series. I have the base EVGA GTX 670 card and I am very happy with it. It has some very, very minor tweaks over the reference design that should improve cooling and noise by an immeasurably small amount.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Chuu posted:

I'm considering upgrading a computer with a 1680x1050 monitor to a 2560x1440 monitor. It's currently a Core i5-2500 w/ a Radeon 4850, and I'm happy with the graphics as is.

How much am I realistically going to have to spend to keep a similar level of quality if I want to game at native resolutions?

1440p, you're looking at a GTX 670 or a Radeon 7950/7970, depending on your preferred card maker. nVidia is winning this generation, but it's not a total beatdown or anything; 7950s/7970s overclock like CRAZY and have great per-clock performance, meaning less overclock goes farther. They're also not at all bandwidth-limited, thanks to the 384-bit memory bus, so you can pretty much go hog wild with the GPU/shader clock and add memory as an afterthought. With the 670/680 you have to balance a solid memory overclock against a solid core overclock or else you won't get all the performance the core has to offer, and that balancing act is a bit of a hassle since there isn't a good way to test it other than playing games and hoping you guessed well.
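To put rough numbers on that bandwidth point (reference memory clocks from memory, so treat as approximate):

```python
def bandwidth_gbs(bus_bits, gbps_effective):
    """Peak memory bandwidth = bus width in bytes * effective data rate."""
    return bus_bits / 8 * gbps_effective

print("HD 7970, 384-bit @ 5.5 Gbps: %g GB/s" % bandwidth_gbs(384, 5.5))  # 264
print("GTX 680, 256-bit @ 6.0 Gbps: %g GB/s" % bandwidth_gbs(256, 6.0))  # 192
```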

Alereon posted:

Remember that time I accidentally edited my reply into your post? Edit != quote is apparently pretty easy for mods :shobon: Anyway, I would recommend against factory-overclocked cards due to the difficulty of testing overclocks for stability on the GTX 600 series. I have the base EVGA GTX 670 card and I am very happy with it. It has some very, very minor tweaks over the reference design that should improve cooling and noise by an immeasurably small amount.

Yeah, I got the SC+ for one reason and one reason only: it was at the actual price of the card for approximately five minutes, and I was ready to pull that trigger if any card became available for a non-scalper price. The 670's disabled SMX means gently caress all, really, in terms of performance, and you can look at graphs showing it seems to open up some overclocking headroom at that, which just makes keeping up with a 680 all the easier. I was lucky to receive a sample which runs at more than a +100 core and +400 memory overclock relative to stock GTX 680 factory settings, without any modification to the BIOS to disable overvoltage protection or anything like that.

A lot of people buying EVGA Superclocked cards are finding they perform as advertised, with not much wiggle room above that. So there's no point playing the lotto on them if you can find a baseline card instead. They do have some designs with a more robust VRM, but it's not necessary unless you're going for [H]-level overclocks that aren't gonna do much for you. nVidia didn't just say "eh, gently caress it" when they reduced the VRM phases from 5 to 4. It only needs 4, and that works just fine. Very sophisticated power management in the card's hardware takes care of everything nicely, and the cooler is extremely good.

I also think that EVGA has some modification to the reference cooling design: it's the blower style, but there are removable acoustic dampening pads in important places, and it is far quieter than the previous-gen cooler despite being very similar in design. (In other words, from nVidia themselves you'd expect any noise improvement to come from the Kepler generation's incredibly effective power management, which keeps the card's power draw to what the workload actually needs; they already introduced the great stock vapor-chamber cooling setup with Fermi, and blowers haven't changed much in a long time, just fans capable of taking in a lot of air despite cramped conditions, with very long projected MTBF.)

My EVGA GTX 580 SC is qualitatively a lot noisier than my EVGA GTX 680 SC+. I have not experienced the issue others have commented on, of the actual frequency of the fan noise being more annoying; that was brought up in a TechReport review, but it doesn't match my experience at all, apples to apples with a very similar cooler type from the same maker. The 580 does not have the noise-reduction doodads on the fan.

Agreed fucked around with this message at 04:58 on Jun 13, 2012

Mr.Fuzzywig
Dec 13, 2006
I play too much Supcom
So this http://www.newegg.com/Product/Produ...ID=3938566&SID= would be a regular card as opposed to the factory-overclocked one?

Fortuitous Bumble
Jan 5, 2007

Mr.Fuzzywig posted:

Thank you very much, does this http://www.newegg.com/Product/Product.aspx?Item=N82E16814130787 look like an acceptable card? I'll admit I just picked the first result off Newegg, but this one seems to have higher clock speeds?

This is the card I was asking about earlier; I ended up ordering one off Amazon. It's a lot quieter than my 5770 or 8800GT, which both had similar coolers and made annoying sounds at idle; this one is completely silent when the computer isn't doing anything, which is what I was hoping for. I haven't tested it enough to say whether it actually gets unstable, but I imagine it could always be underclocked back to stock if it does. I couldn't say exactly what the hardware difference is, though.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Mr.Fuzzywig posted:

So this http://www.newegg.com/Product/Produ...ID=3938566&SID= would be a regular card as opposed to the factory-overclocked one?
Yes, I believe that's the exact card I have.

Seven Round Things
Mar 22, 2010
It seems AMD have been releasing renamed versions of old products lately, and this has confused me. Specifically, I was told the 6770 is a renamed 5770. Can someone explain just how my 5770 compares to AMD's current lineup?

is that good
Apr 14, 2012
The 7750 is a bit worse than the 6770 (which is equal to the 5770), but runs cooler and uses so little power that it does not need a PCIe power cable (though there is an upclocked version that uses the cable and should be somewhat more powerful). The 7770 is a bit worse than the 6850 but again uses less power. The 7850 is approximately equal to the 6950.
They all also have assorted extras and may possibly improve in performance through future driver updates.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Here's one extra that matters: way better DX11 performance. This is the generation where ATI really got to grips with DX11, in my opinion. Though in-game implementations still tend to somewhat favor nVidia's approach to DX11 (they have, for example, practically no performance hit when enabling tessellation, though ADOF can still be a hog), ATI is no longer a second-rate performer in all but synthetic benchmarks.

Wish I could find a better example, but Heaven's okay for demonstrating what I mean, I guess. Check the 7850 out vs the 570 vs the 6950:

http://www.techradar.com/reviews/pc-mac/pc-components/graphics-cards/amd-radeon-hd-7850-1068373/review/page:2#articleContent

Grim Up North
Dec 12, 2011

I'm looking for an entry-level NVIDIA GPU to develop double-precision CUDA software on. Now, this is not for production, so I don't really need much horsepower (and I have no use for the actual graphics part at all), but I don't want to get a card with an out-of-whack price/performance ratio either. However, I don't really know where to look for general double-precision CUDA benchmarks.

Would a GTS 450 GDDR5 (shipped for 80€) be an OK buy? Thanks.

Rap Game Goku
Apr 2, 2008

Word to your moms, I came to drop spirit bombs


Grim Up North posted:

I'm looking for an entry-level NVIDIA GPU to develop double-precision CUDA software on. Now, this is not for production, so I don't really need much horsepower (and I have no use for the actual graphics part at all), but I don't want to get a card with an out-of-whack price/performance ratio either. However, I don't really know where to look for general double-precision CUDA benchmarks.

Would a GTS 450 GDDR5 (shipped for 80€) be an OK buy? Thanks.

Doesn't CUDA require like a 570 or better?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Grim Up North posted:

I'm looking for an entry-level NVIDIA GPU to develop double-precision CUDA software on. Now, this is not for production, so I don't really need much horsepower (and I have no use for the actual graphics part at all), but I don't want to get a card with an out-of-whack price/performance ratio either. However, I don't really know where to look for general double-precision CUDA benchmarks.

Would a GTS 450 GDDR5 (shipped for 80€) be an OK buy? Thanks.
Yes, though you might consider a Kepler card instead or as well, since it's just different enough an architecture to require slightly different optimizations. If performance truly isn't important, then a GTS 450 is probably overkill, but, y'know, whatever?

CUDA price/performance ratios usually kick in when comparing GeForce 560 Ti-448, 570, or 580 to a Fermi-based Quadro. Anything else and CUDA price/performance is just awful since the CUDA performance is so low. At that point, basically anything works for "Will it run?" testing.
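If you want to ballpark the theoretical peaks yourself (core counts and shader clocks quoted from memory, with GeForce DP rate caps of 1/8 for GF110 and 1/12 for GF106, so treat all numbers as approximate):

```python
def dp_gflops(cores, shader_mhz, dp_ratio):
    """Peak = cores * 2 FLOPs (FMA) per clock, scaled by the DP rate cap."""
    return cores * 2 * shader_mhz / 1000 * dp_ratio

print("GTS 450: ~%d DP GFLOPS" % dp_gflops(192, 1566, 1 / 12))  # ~50
print("GTX 570: ~%d DP GFLOPS" % dp_gflops(480, 1464, 1 / 8))   # ~175
```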


Grim Up North
Dec 12, 2011

Factory Factory posted:

CUDA price/performance ratios usually kick in when comparing GeForce 560 Ti-448, 570, or 580 to a Fermi-based Quadro. Anything else and CUDA price/performance is just awful since the CUDA performance is so low. At that point, basically anything works for "Will it run?" testing.

Thanks, I'd like to be able to get "Will it run reasonably fast?" testing as well, so I should probably go for a 570. (The 560 Ti-448 prices are almost the same as the 570 prices.)

Am I right in assuming that Kepler is completely uninteresting for double precision computing until the GK110 (Tesla K20) is released?
