|
I'm looking for an entry-level NVIDIA GPU to develop double-precision CUDA software on. Now this is not for production, so I don't really need much horsepower (and I have no use for the actual graphics part at all), but I don't want to get a card with an out-of-whack price/performance ratio either. However, I don't really know where to look for general double-precision CUDA benchmarks. Would a GTS 450 GDDR5 (80€ shipped) be an OK buy? Thanks.
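If it helps, here are some back-of-envelope peak-throughput numbers. This is just a sketch: the core counts and shader clocks are the published specs, but the FP64:FP32 divisors are the commonly quoted ones for consumer Fermi cards, so treat those as assumptions.

```python
# Back-of-envelope peak FP64 comparison for two candidate cards.
# Divisors (FP64 rate as a fraction of FP32) are the commonly cited
# values for consumer Fermi SKUs -- treat them as assumptions.

def peak_gflops(cores, shader_mhz, flops_per_core=2):
    """FP32 peak: cores * 2 FLOPs per cycle (FMA) * shader clock."""
    return cores * flops_per_core * shader_mhz / 1000.0

cards = {
    # name: (CUDA cores, shader clock in MHz, FP64:FP32 divisor)
    "GTS 450 (GF106)": (192, 1566, 12),
    "GTX 570 (GF110)": (480, 1464, 8),
}

for name, (cores, mhz, div) in cards.items():
    fp32 = peak_gflops(cores, mhz)
    print(f"{name}: {fp32:.0f} GFLOPS FP32, ~{fp32 / div:.0f} GFLOPS FP64")
```

So on paper the 570 is roughly 3.5x the GTS 450 in double precision, which is worth keeping in mind against the price gap.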
|
# ¿ Jun 13, 2012 18:26 |
|
|
Factory Factory posted:CUDA price/performance ratios usually kick in when comparing GeForce 560 Ti-448, 570, or 580 to a Fermi-based Quadro. Anything else and CUDA price/performance is just awful since the CUDA performance is so low. At that point, basically anything works for "Will it run?" testing. Thanks, I'd like to be able to do "Will it run reasonably fast?" testing as well, so I should probably go for a 570. (The 560 Ti-448 prices are almost the same as the 570 prices.) Am I right in assuming that Kepler is completely uninteresting for double-precision computing until the GK110 (Tesla K20) is released?
|
# ¿ Jun 13, 2012 19:03 |
|
Colonel Sanders posted:Probably good for bitcoin mining. . . I know you said that in jest, but I'd expect Bitcoin mining to be a purely integer-based computation, and a scientific co-processor, with its focus on double-precision floating-point math, not to be cost-effective. And Bitcoin mining was all about cost efficiency. On the other hand, I don't really know what the Pentium cores are there for.
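For what it's worth, the proof-of-work is just SHA-256 run twice over an 80-byte block header, i.e. pure 32-bit integer work with no floating point anywhere. A minimal sketch (the header here is a dummy, not a real block):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Dummy 80-byte "block header" -- real miners vary a nonce field inside it.
header = bytes(80)
digest = double_sha256(header)

# Mining means finding a nonce so the digest, read as an integer, falls
# below a target. It's all integer ops and comparisons; FP64 never enters.
print(digest.hex())
```

Which is why DP-heavy compute cards buy you nothing there.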
|
# ¿ Jun 20, 2012 07:37 |
|
Whale Cancer posted:I'm looking to invest about $300 into a card for gaming. I'll be running an i5 3570 chip. I'm currently set on the 560 ti 448. Is there a better option at my price point? Wait for the GTX 660 Ti to come out in three days. It will initially cost $300 and be quite a bit faster. (Read the posts above yours.)
|
# ¿ Aug 13, 2012 08:25 |
|
There are rumors on Chinese tech sites that the GTX 660 and the GTS 650 will be released on September 6th.
|
# ¿ Aug 14, 2012 18:37 |
|
These are actually the GTX 660 OEM specs, and in NVIDIA-land OEM parts and retail parts sometimes have the same name, completely different innards, and wildly different performance. I'm pretty sure we will soon see the retail GTX 660 based on a new chip, as I can't believe they'll release another retail card with yet another GK104 binning.
|
# ¿ Aug 22, 2012 10:39 |
|
Factory Factory posted:Zotac - Newer but good brand reputation, not many Goon experiences, reviews well Just one data point, but I have a Zotac 8800 GTS 512 which is still going strong after god knows how many years.
|
# ¿ Aug 23, 2012 08:02 |
|
Avocados posted:What Nvidia cards are out right now that are comparable to the 5870 in terms of performance? I'd rather "replace" the card instead of upgrade, as funds are a little low. Not out right now, but the GTX 660 should launch in less than a week, be comparable to a Radeon HD 7850 and have an MSRP of $229.
|
# ¿ Sep 7, 2012 14:17 |
|
Edward IV posted:Wow. Though I wonder how Intel is dealing with the memory bandwidth issue that makes AMD's APUs need high speed memory. Intel apparently intends to die-stack some low-power DDR3 directly onto the CPU die, which allows for a very wide bus and would considerably reduce the bandwidth requirements for external RAM. E: Here's a cool image showing Sony doing it in the PS Vita: Grim Up North fucked around with this message at 18:06 on Sep 12, 2012 |
# ¿ Sep 12, 2012 18:01 |
|
Factory Factory posted:Radeon 7000 through 9800, X300 through X1950, HD 2000 to HD 8000... Radeon 4K 2870.
|
# ¿ Dec 20, 2012 09:55 |
|
Did y'all think that your card would survive going from 1080p to 4K? Pretty fun, if way too short, article at AnandTech.
|
# ¿ Jul 1, 2013 15:40 |
|
Sindai posted:You should post that in the Xbone or PS4 threads because people have been claiming they probably won't really do 1080p because some previous gen games didn't really do 720p for ages now.
|
# ¿ Sep 24, 2013 19:48 |
|
Sir Unimaginative posted:I don't know if it's just 560 Ti chat, but I ran into a TDR recovery failure bugcheck (0x116) on 327.23. Yeah, I know that the 560 Ti is ancient by current standards, but I'm really starting to hate NVIDIA's driver team's blatant disregard for 560 Ti owners.
|
# ¿ Sep 27, 2013 09:11 |
|
quote is not edit, damnit
|
# ¿ Sep 27, 2013 09:12 |
|
Yeah, sounds nice, but I'm a bit wary. Will this be a repeat of 3DVision, where there's really only one display that supports it, and it's great for gaming but poo poo for everything else?
|
# ¿ Oct 18, 2013 18:44 |
|
Sormus posted:Currently I've cancelled my order for a reference-cooler 290X and awaiting for an improved design one. Wait a week, the R9 290 might very well be worth the wait.
|
# ¿ Oct 30, 2013 11:35 |
|
Does ~Raptr by AMD~ automatically push your data into the cloud? You have to opt in to that, right?
|
# ¿ Oct 31, 2013 20:28 |
|
You know apart from hoping that they get my 50 bucks worth of silicon working again, I'm now really interested to finally hear what the hell is going on with the whole 560 Ti TDR stuff.
|
# ¿ Nov 1, 2013 17:15 |
|
Pretty, uh, hot card. A loud and hot card. But with a custom cooler on it, it's actually an awesome card. Here's a test with an Accelero Xtreme III: (290 "Uber" is them setting the fan speed to 55%.) (Lautstärke being noise level, if the dB(A) didn't tip you off.)
|
# ¿ Nov 5, 2013 09:56 |
|
HalloKitty posted:The big question mark here is: what the hell happens to Titan? It's less capable in every single way than 780 Ti. It's a product with a fancy name and no market. It's still the entry-level compute card, is that no market?
|
# ¿ Nov 7, 2013 15:41 |
|
mayodreams posted:Cross posting this from the Wii U demise thread. Someone posted a chart with the relative performance of the consoles, and I added in the 780 TI and 290x. Interesting, but maybe a 280X would be a more appropriate card as that would be at least somewhat comparable cost wise.
|
# ¿ Nov 7, 2013 21:27 |
|
GrizzlyCow posted:Join them. Join them with their 780s, their R9 290X's, their aftermarket cooling, and their 3-Way setups. SLI with them, and buy two 120Hz 1440p monitors. Don't look back. Lower than a 770 lies the path of weakness. Be unconstrained by such trivial notions as value or common sense. Burn your money and be free. Burn, money, burn! On Mount Vesuvius the Titans crush all mortal sense. Burn, money, burn!
|
# ¿ Nov 16, 2013 20:24 |
|
I don't get it, are there really people who choose one card over another just because someone's lljk handle is printed on it?
|
# ¿ Dec 6, 2013 19:41 |
|
Agreed posted:If you're running a 400- or 500-series card, nab this sucker quick-like and see if you can finally play some recent releases worth a drat instead of being stuck on R316 and earlier drivers. While this driver seems to fix the problem in many, maybe the majority of, cases, unfortunately in my case it didn't. I've been sending Nvidia the dump files and hope they keep working on it. Still, if you haven't tried it yet, it's definitely worth a shot.
|
# ¿ Dec 7, 2013 14:08 |
|
Agreed posted:They are selling an FPGA that will replace the scalar unit on compatible monitors (can't remember where I saw that specific detail, but it was with an nVidia guy). We don't know which monitors will be compatible yet, though. To be honest, the whole time I've been wondering if they really want to sell a board where you have to crack open a monitor (made by a third party) that is not meant to be opened by end-users. That seems fraught with a whole lot of liability concerns, and even if it's sold as an enthusiast, do-it-at-your-own-risk kit, I could see it lead to negative PR. Anyways, offer me a Dell UltraSharp-compatible kit and I'll be interested.
|
# ¿ Dec 13, 2013 18:05 |
|
Nephilm posted:Maybe we'll start seeing the 800 series in April-June, or they could start releasing more 28nm Maxwell parts before then. Either way, announcements and rumors should start appearing in March. The last official word I've read was http://wccftech.com/nvidia-maxwell-geforce-gtx-750-ti-gtx-750-official-specifications-confirmed-60watt-gpu-geforce-800-series-arrives-2014/ posted:NVIDIA also confirmed during the conference that they are planning to introduce the GeForce 800 series which is fully based on the Maxwell architecture in second half of 2014. This means that we will see the proper high-performance GPUs such as the replacements for GeForce GTX 780, GeForce GTX 770 and GeForce GTX 760 in Q3 2014. We have already noted codenames of the high-end Maxwell chips which include GM200, GM204 and GM206, however NVIDIA didn’t mention what process they would be based on but early reports point out to 20nm. I'm not sure if NVIDIA just confirmed second half of 2014 or Q3 but I wouldn't expect high-end parts in Q2.
|
# ¿ Feb 20, 2014 19:47 |
|
Heise (article in German) apparently heard at CeBIT that 20nm GPUs are coming out in August (AMD) at the earliest, and NVIDIA is talking about Q4'14 and even Q1'15. Nothing too official, but it seems that the 780 Ti-havers itt can pat themselves on the back.
|
# ¿ Mar 11, 2014 16:13 |
|
Ignoarints posted:looks better than mine Holy poo poo, did your PC metastasise? Anyway, I think you should post your room in PYF goon lair, your first picture is very promising.
|
# ¿ Mar 12, 2014 19:35 |
|
Here's a fun link I haven't seen posted on here: http://richg42.blogspot.de/2014/05/the-truth-on-opengl-driver-quality.html It's Rich Geldreich's impression of various OpenGL drivers. quote:Vendor A quote:Vendor B quote:Vendor C
|
# ¿ May 19, 2014 15:53 |
|
Factory Factory posted:Sorry geez! Maybe it's noon Pacific instead. Wasn't it the 19th for the actual NDA lift/paper launch, and today some general Maxwell stuff? Or was that just rumours?
|
# ¿ Sep 8, 2014 18:05 |
|
veedubfreak posted:Guys, I'm allowed to buy a 980 for my triple 1440 set up right? I wouldn't want to get anyone's panties in a bunch. That's like two 4K displays worth of pixels. Shouldn't you be buying two or more of them?
|
# ¿ Sep 22, 2014 19:27 |
|
Hamburger Test posted:imgur doesn't accept .svg and I'm too lazy to find a program to convert them unless you really want to see them. https://mediacru.sh/ I'd be interested.
|
# ¿ Sep 24, 2014 18:39 |
|
Fajita Fiesta posted:What happened with that announcement that AMD was teasing on the 25th? OpenCL 2.0 driver support.
|
# ¿ Sep 28, 2014 21:28 |
|
Khagan posted:Where do these supposed Sisoft Sandra results for a 390X put it in relation to other GPUs? If the results are valid and the CUs are the same as in current GCN, it has roughly 50% more shaders. The RAM with the new HBM (4096-bit bus) is a bit harder to talk about: on the one hand bandwidth is roughly double, but I don't think current-gen GCN chips are especially memory-bandwidth starved. I'd say it should come out at about 50% faster than an R9 290X. E: Here is a Sisoft entry for big Maxwell (GM200): http://www.sisoftware.eu/rank2011d/show_run.php?q=c2ffccfddbbadbe6deeadbeedbfd8fb282a4c1a499a98ffcc1f9&l=de +50% shaders, but no HBM, so probably not 50% faster than a 980. Grim Up North fucked around with this message at 16:40 on Nov 11, 2014 |
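To put rough numbers on that hand-waving, a naive scaling sketch (the 390X figures are the rumored ones from the Sandra entry, not confirmed specs; the 290X numbers are public):

```python
# Naive scaling estimate, not a prediction: assume speedup is capped by
# the smaller of compute growth and bandwidth growth.
r290x = {"shaders": 2816, "bandwidth_gbs": 320}
r390x_rumored = {"shaders": 4096, "bandwidth_gbs": 640}  # 4096-bit HBM rumor

compute_gain = r390x_rumored["shaders"] / r290x["shaders"]
bw_gain = r390x_rumored["bandwidth_gbs"] / r290x["bandwidth_gbs"]

# If the chip isn't bandwidth starved, compute sets the ceiling.
estimate = min(compute_gain, bw_gain)
print(f"compute +{compute_gain - 1:.0%}, bandwidth +{bw_gain - 1:.0%}, "
      f"naive speedup ~{estimate - 1:.0%}")
```

The shader gain works out to about +45%, so "roughly 50% faster" is where the naive math lands too, with bandwidth comfortably not the bottleneck.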
# ¿ Nov 11, 2014 16:35 |
|
Overclock your CPU, that's what the K chips are made for. And since you have a Sandy Bridge CPU and an Asus Z68 board the process will be super easy. Check the "Overclocking for Workgroups" thread.
|
# ¿ Dec 17, 2014 14:35 |
|
The new JPR numbers for discrete graphics card market share are out, and it is looking bad for AMD. Here is a graphic compiling the last thirteen years of (only) ATI/AMD vs. NVIDIA with key releases for reference: I hope AMD can still innovate in the GPU market and doesn't completely give up like they did with CPUs.
|
# ¿ Feb 25, 2015 13:37 |
|
Gibbo posted:How long am I going to have to wait for a 3gb 960? Assuming you want a 4GB one, it seems that Asus has confirmed that they will launch theirs this month. (But ^^^ it might not make sense to buy one.)
|
# ¿ Mar 13, 2015 10:05 |
|
Here's an interesting forum post on Mantle/Vulkan/DX12 by a somewhat well-known gamedev.quote:So why didn't we do this years ago? Well, there are a lot of politics involved (cough Longs Peak) and some hardware aspects but ultimately what it comes down to is the new models are hard to code for. Microsoft and ARB never wanted to subject us to manually compiling shaders against the correct render states, setting the whole thing invariant, configuring heaps and tables, etc. Segfaulting a GPU isn't a fun experience. You can't trap that in a (user space) debugger. So ... the subtext that a lot of people aren't calling out explicitly is that this round of new APIs has been done in cooperation with the big engines. The Mantle spec is effectively written by Johan Andersson at DICE, and the Khronos Vulkan spec basically pulls Aras P at Unity, Niklas S at Epic, and a couple guys at Valve into the fold.
|
# ¿ Mar 13, 2015 10:15 |
|
Space Gopher posted:Haha, no it's not. I thought that was, but sometimes I really don't know anymore ...
|
# ¿ Mar 13, 2015 15:30 |
|
|
1gnoirents posted:I hope this isn't true Seems to match the prices heise.de, a usually reliable source, got to see at CeBIT. Both the 390 and 390X faster than a 290X, as expected. Nice to see that they are going for 8GB of HBM VRAM.
|
# ¿ Mar 16, 2015 18:51 |