|
Agreed posted:This isn't the parts picking thread, and we JUST had a discussion of the best cards for given price points, but yeah it's the GTX 760. Performance does not hold up super well at >1200p resolution, but should you expect it to? It's a very solid card at a great price. You could also get its discontinued predecessor, the 660Ti, which is virtually the same card in performance terms but might be had on eBay or other outlets closer to $200, for an even better deal. Don't get gouged on it, though, you run that risk with cards that people bought thinking they might SLI down the line, one "risk" of that strategy. It's a bad strategy. Yeah sorry about that. I noticed the parts picking thread after I made that post. I'll get over there. But thanks for the tips regardless.
|
# ? Apr 10, 2014 14:04 |
|
|
Agreed posted:This isn't the parts picking thread, and we JUST had a discussion of the best cards for given price points, but yeah it's the GTX 760. Performance does not hold up super well at >1200p resolution, but should you expect it to? It's a very solid card at a great price. You could also get its discontinued predecessor, the 660Ti, which is virtually the same card in performance terms but might be had on eBay or other outlets closer to $200, for an even better deal. Don't get gouged on it, though, you run that risk with cards that people bought thinking they might SLI down the line, one "risk" of that strategy. It's a bad strategy. I'm already spoiled by maxxxing out everything, including global settings that probably don't even do much. I'm ordering a 1440p today and I'm going to be sad when I have to turn things down, especially since I'll probably be at 85hz or more (and fps is king for me). time for 2x 770s
|
|
# ? Apr 10, 2014 14:37 |
|
Hey Agreed, what speed were you getting on your normal 780 before you sold it? I'm going to pop the one I got from microcenter in over the weekend and see how far it goes on air and decide if I should get a waterblock for it or not. My 290s don't overclock for poo poo and since I'm going back to single card, just wondering if an overclocked 780 or 290 is the one to keep. I tried 1100 on my core last night and started getting artifacts and driver crashes after just 2-3 games. Temps aren't my issue, the 290 just doesn't overclock for poo poo without giving it tons of voltage.
|
# ? Apr 10, 2014 16:04 |
veedubfreak posted:Hey Agreed, what speed were you getting on your normal 780 before you sold it? I'm going to pop the one I got from microcenter in over the weekend and see how far it goes on air and decide if I should get a waterblock for it or not. My 290s don't overclock for poo poo and since I'm going back to single card, just wondering if an overclocked 780 or 290 is the one to keep. I tried 1100 on my core last night and started getting artifacts and driver crashes after just 2-3 games. Temps aren't my issue, the 290 just doesn't overclock for poo poo without giving it tons of voltage. (sell the 290's for a 780ti?)
|
|
# ? Apr 10, 2014 16:06 |
|
veedubfreak posted:Hey Agreed, what speed were you getting on your normal 780 before you sold it? I'm going to pop the one I got from microcenter in over the weekend and see how far it goes on air and decide if I should get a waterblock for it or not. My 290s don't overclock for poo poo and since I'm going back to single card, just wondering if an overclocked 780 or 290 is the one to keep. I tried 1100 on my core last night and started getting artifacts and driver crashes after just 2-3 games. Temps aren't my issue, the 290 just doesn't overclock for poo poo without giving it tons of voltage. Why not wait for the 6GB versions? Don't you have a high res setup?
|
# ? Apr 10, 2014 19:02 |
|
Agreed posted:Translation: you really should stick to the appropriate thread because you're going to get "video card enthusiast" style recommendations here, as you can see from mine, clashing against "save money, here's how" style responses, as you can see from the other responder, and really it's just a hot mess since we don't know what your resolution is, we don't know what kind of settings you're after, etc. - things that are explicitly requested before making a rec in the appropriate thread. Honestly anything lower than a GTX760 or R9 270x just doesn't seem like a gaming card at all. For $200-220 that seems perfectly appropriate. I would either be using onboard or drop the $200 on a good card - not much sense going in the middle IMO.
|
# ? Apr 10, 2014 19:23 |
|
SlayVus posted:Why not wait for the 6GB versions? Don't you have a high res setup? The only time memory seems to become an issue is in multicard setups. Since none of the games I play support multicard, I'm trying to decide which card to keep until the 20nm cards come out. Basically I'm just trying to minimize my costs by keeping a single card. Just haven't decided which. I can return the 780 within 30 days so that card is kind of a nonfactor if I decide not to keep it.
|
# ? Apr 10, 2014 19:28 |
SlayVus posted:Why not wait for the 6GB versions? Don't you have a high res setup? Honestly man, the only differences between 2 or 4 gb vram that I've seen (even in single cards) in game benchmarks is when you go 3 monitors wide. I'm becoming more and more convinced that a lot of vram (right now) isn't worth the money. Just one example http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/ It's even less of a difference in SLI actually, but I haven't come across much testing on that.
|
|
# ? Apr 10, 2014 19:53 |
|
Ignoarints posted:Honestly man, the only differences between 2 or 4 gb vram that I've seen (even in single cards) in game benchmarks is when you go 3 monitors wide. I'm becoming more and more convinced that a lot of vram (right now) isn't worth the money. Which is what veedub runs. (3x1440p)
|
# ? Apr 10, 2014 20:59 |
deimos posted:Which is what veedub runs. (3x1440p) Oh, I just figured because 1 video card... well drat heh. Well for anyone else who wants to know I guess
|
|
# ? Apr 10, 2014 21:03 |
|
Agreed posted:I'd be interested in hearing your reasoning behind this, especially considering that nVidia is absolutely certain at this point that they'll be fighting Intel in the HPC market at the same time and has a certain degree of inertia that forces their hand there (though it was a very clever move, in retrospect, to go from big hot-clocked SMs to SMXes with greater parallelization... though SMXes are a tad batshit in a "well it works fine, stop griping" kind of way). Not to mention it's not in keeping with, well, anything, frankly, that we've seen from either company. But before I say anything further I would prefer, if you don't mind, for you to note your thinking on the matter so we're not talking past each-other. The GK104 GPU that powered the GTX 680/770 is a gaming GPU. On the GTX 770 it offers 3.2 TFLOPs of single-precision compute performance, 33.5 GigaPixels/sec of ROP performance, and 134 GigaTexels/sec of texturing performance, at a cost of 3.54 Billion transistors, 294 mm^2 of die area, and 230 watts, with a launch price of $499 on the GTX 680 and $399 on the GTX 770. The GK110 GPU that powered the GTX 780 is an HPC GPU. On the GTX 780 it offers 3.977 TFLOPs FP32, 41.4 GP/sec, and 160 GT/sec, at a cost of 7 Billion transistors, 561 mm^2 of die area, and 250W, with a launch price of $650. For nearly twice the cost to nVidia, GK110 does not deliver anywhere near twice the performance of GK104 for gaming applications. Granted the GTX 780 was very heavily die harvested, but that is necessary to allow the GPU to reach yield and power targets and a hypothetical GM100 would face the same neutering for consumer cards. A major point of the Maxwell architecture is efficiency, both power and die-size, so I think a Maxwell-based GTX 880 would look more like a larger GM104 without the transistors dedicated to FP64 and the pins/board area for a memory bus wider than 256-bit. 
They can use two of those in a GTX 890 for applications where a single GTX 880 isn't enough, and we'll eventually see die-harvested versions of whatever the successor to GK110 is for Teslas.
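The cost/performance argument above is easy to sanity-check with quick arithmetic. A minimal sketch in Python, using only the figures quoted in the post (TFLOPs, die areas, transistor counts, and launch prices as stated there):

```python
# GK104 (GTX 770) vs GK110 (GTX 780), figures as quoted in the post above.
gk104 = {"fp32_tflops": 3.2, "transistors_b": 3.54, "die_mm2": 294, "price": 399}
gk110 = {"fp32_tflops": 3.977, "transistors_b": 7.0, "die_mm2": 561, "price": 650}

def per_resource(gpu):
    """FP32 throughput normalized by transistor budget and die area."""
    return {
        "tflops_per_b_transistors": gpu["fp32_tflops"] / gpu["transistors_b"],
        "tflops_per_mm2": gpu["fp32_tflops"] / gpu["die_mm2"],
    }

for name, gpu in (("GK104", gk104), ("GK110", gk110)):
    print(name, {k: round(v, 4) for k, v in per_resource(gpu).items()})

speedup = gk110["fp32_tflops"] / gk104["fp32_tflops"]  # ~1.24x the FP32 throughput
area_ratio = gk110["die_mm2"] / gk104["die_mm2"]       # ~1.9x the die area
```

The point survives the arithmetic: as shipped, GK110 on the GTX 780 delivers roughly 1.24x the FP32 throughput of GK104 on the GTX 770 for roughly 1.9x the die area and 2x the transistors.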
|
# ? Apr 10, 2014 22:34 |
|
Alereon posted:... capabilities that are disabled on gaming cards costs a huge number of transistors that could either not be spent (lowering costs and raising clock headroom), or could be spent on things that do improve gaming performance. edit: also read this http://www.highperformancegraphics.org/previous/www_2012/media/Hot3D/HPG2012_Hot3D_NVIDIA.pdf Professor Science fucked around with this message at 23:57 on Apr 10, 2014 |
# ? Apr 10, 2014 23:47 |
|
Professor Science posted:just wanted to point out one thing here: GK110 is 20W higher than GK104 despite having 50% more memory and 2x the transistors. I actually think the extra gig of GDDR5 makes up the bulk of that disparity, but the thing to keep in mind is that the vast majority of those extra transistors in GK110 are just *off*. the FP64 stuff doesn't even get powered on, which means there's no static leakage (which is a huge chunk of your power costs at 28nm). it's not like they're giving up 3.5B transistors that could be powered and actively doing something useful for an app. Power really is the least of my concerns, though. Die area means yield, cost, and volume penalties. I think gamers would rather spend that area and those transistors on more FP32 SMXs and fixed-function hardware for graphics, and that nVidia knows they get the most cost-efficient product by going that route.
|
# ? Apr 11, 2014 01:56 |
Sold my monitor quicker than I thought, so I won't have one until a Qnix comes in. At least then I can see what I can manage at 1440p with 2GB cards. Spent the time repasting my 660 Tis finally, and did a little bit of cable management at least
|
|
# ? Apr 11, 2014 04:19 |
|
drat it I would be tempted to get a second ASUS 660Ti used and try SLI, but then I remembered that my motherboard only supports Crossfire. But is Crossfire still a wash on AMD cards? I ask because I do have two Sapphire HD7850 cards: one that I had in my main rig before upgrading to a 660Ti, the other still in my HDTV rig, but I hardly game on my HDTV rig anymore. Or would I get microstuttering in half of my games from frame pacing issues again like I was getting with a single HD7850?
|
# ? Apr 11, 2014 07:15 |
spasticColon posted:drat it I would be tempted to get a second ASUS 660Ti used and try SLI but then I remembered that my motherboard only supports Crossfire. I got a LITTLE stuttering at max settings in BF4 only. I'd hit anywhere from 90-180 (lol I know) fps but would dip well under 60 occasionally. Once I frame limited to my monitor's Hz, basically it's smooth as silk. It seems to go either way with Crossfire depending on the situation, but generally it's not as good. I had Crossfire 7870's for like two weeks a while ago and it was really bad. If you want to SLI, sell and swap your mobo
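For what it's worth, the frame-limiting trick works because it evens out frame delivery instead of letting the cards burst ahead and then stall. A toy sketch of the idea in Python (not a real render loop; `render_frame` is a stand-in callback):

```python
import time

def run_capped(render_frame, target_fps=60, frames=120):
    """Deliver frames at an even cadence by sleeping out the remainder of
    each frame interval, rather than bursting to 90-180 fps and then dipping."""
    interval = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # stand-in for the actual game/render work
        elapsed = time.perf_counter() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)  # pad short frames to the target pace
```

Real limiters (driver-level or RivaTuner-style) do this with much finer timing, but the cadence idea is the same.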
|
|
# ? Apr 11, 2014 13:29 |
|
spasticColon posted:drat it I would be tempted to get a second ASUS 660Ti used and try SLI but then I remembered that my motherboard only supports Crossfire. If you want to enter the realm of janky poo poo, there's been hacks for years to enable SLI on non-sli boards. I've never done it myself. http://www.overclockers.com/forums/showthread.php?t=638013
|
# ? Apr 11, 2014 13:46 |
Zero VGS posted:If you want to enter the realm of janky poo poo, there's been hacks for years to enable SLI on non-sli boards. I've never done it myself. Totally support janky poo poo (although it might not be worth buying a video card you can't return without knowing it worked first somehow) Ignoarints fucked around with this message at 15:13 on Apr 11, 2014 |
|
# ? Apr 11, 2014 14:15 |
|
Alereon posted:You're right that the TDP difference is not as significant between GK104 and GK110 as implemented on the GTX 780, but that's primarily because so much of the GTX 780 GPU is disabled. There's a 65W increase from the GTX 770 to the GTX 780 Ti running in 1/24 FP64, and the 780 Ti is actually clocked lower than the GTX 770 and almost identically to the GTX 780. Also, are you sure that the transistors providing FP64 are unpowered on 1/24 FP64 GK110? It's not like there are FP64 SMXs they can completely power-gate, all the SMXs are capable of 1/3 FP64, so while I agree you're obviously not paying for those transistors to be switching I don't think you're getting zero leakage from them. I'll admit I don't know enough about the low-level details of how the chip works to be confident here, though. Without getting into teeny tiny transistors and the 3D lithography involved in making 'em tick, I feel confident stating that your concerns as put forward are addressed adequately by 1. The strategy presented in the slides that Prof. Science linked, which favors performance and efficiency over reduction in die area (an issue that has notably affected AMD, especially when looking at the TDP figures for our little "preview" Maxwell cards - they seem with what I'm sort of stupidly calling "speculative certainty" to have initially planned their current crop around a process node shrink, and their transistor density and heat output is adequately explained by the failure of TSMC to meet that demand). 2. The practical reality of GK110 - it works fine, what's the issue? It's big? Dies are pretty big these days. Supply is not a problem. If this would be a problem, one would expect it to have shown itself as a problem by now.* 3. HPC first, gamers a long rear end time later. nVidia knows Pascal is going to be positioned against Intel. They'll have a card to meet it, or they'll fail. 
That card may not debut in consumer space, but rather follow the same "pure HPC --> workstation --> consumer" path that's been so successful for GK110, but I think referring to the needs of gamers as being a serious consideration is less ... poignant, at this particular juncture, than perhaps it could be. Right now it's tooth and nail in HPC and if Intel loses with Knight's Landing, it probably won't be by much and they're still in the game; it'd be a more devastating loss for nVidia. Games have adapted to what these little supercomputers we call graphics cards are good at for good reasons. It's symbiotic, serendipitous, and also totally necessary. So that works out great for everyone. *What defines GK110 as a "desperation move," as you referred to it? That's my strongest point of total, I-don't-understand-what-your-words-mean disagreement.
|
# ? Apr 11, 2014 20:00 |
|
If you want to be a profitable GPU company, your goal is to produce GPUs that are as small as reasonably possible for a given market segment. This allows you to produce more of them per-wafer as well as reducing the impact of defects, lowering your per-GPU cost. Remember that GK110 is a year old now; it absolutely was dogged with yield and availability issues due to its size; even the halo part was die-harvested! The fact that you see general availability now is because TSMC's 28nm process is as mature as you can get and yields are incredibly high considering die size. I described the GK110 as a desperation move when sold in gaming cards because you're basically wasting almost half the transistors on the GTX 780; it was sold as a high-end gaming card because nVidia had to get rid of cards, they didn't have a 20nm successor to GTX 680, and AMD was brute-forcing performance with huge HPC GPUs that were effectively pre-overclocked. I'm certainly not saying that nVidia won't make a successor to GK110, I'm saying that GPU won't be the GTX 880, in the same way GK110 wasn't the GTX 680. nVidia doesn't want to sell gamers an HPC card with most of its capabilities wasted, they want to sell them a gamer card that costs half as much to build and delivers most of the gaming performance, freeing them to sell HPC GPUs to a market that will pay more for them.
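The dies-per-wafer and defect math behind that claim can be sketched directly. This uses the standard dies-per-wafer approximation and a Poisson zero-defect yield model; the defect density here is an assumed illustrative number, not a real TSMC figure:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation: gross dies from usable wafer area, minus edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2=0.2):
    """Fraction of dies with zero defects under a Poisson defect model (assumed D0)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area in (("GK104 (294mm^2)", 294), ("GK110 (561mm^2)", 561)):
    n = dies_per_wafer(area)
    y = poisson_yield(area)
    print(f"{name}: ~{n} dies/wafer, ~{y:.0%} defect-free, ~{n * y:.0f} good full dies")
```

Under these assumptions the big die yields roughly a third as many fully-working dies per wafer, which is exactly why the halo part shipped die-harvested.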
|
# ? Apr 12, 2014 07:18 |
|
spasticColon posted:But is crossfire still a wash on AMD cards?
|
# ? Apr 12, 2014 09:43 |
|
Is 1228 a decent overclock on a 780?
|
# ? Apr 12, 2014 19:50 |
|
Don Lapre posted:Is 1228 a decent over clock on a 780? Very very good, actually. Top notch on air, you'd have to go into the BIOS and allow more voltage to get a higher one in most cases. I had a pretty steady 1170s OC on mine and that was a good OC at the time, they're generally high leakage and getting a really high OC on air is impressive so long as you're sure it's stable. On the various overclocking sites, that'd put you among the "elite" of non-modded 780 overclockers. So, yeah, solid! Alereon, I think you're missing nVidia's broader strategy moving forward. That said, I'm as interested as anybody to see what the specifics are on their next highest end card for consumers - I wouldn't be at all surprised to see something similar to Kepler's launch "style" or pattern, introducing a powerful-enough-to-compete smaller card first and then when stock permits moving to the bigger card. I just disagree that it was in any way a desperate move on their part; rather, I think it was a calculated decision and paid off handsomely for them. And it is repeatable, strategically speaking. I may be overestimating TSMC's lithographic prowess once they finally get the node shrink up and running; I can't say for certain. But I think that nVidia's very capable modular approach has yielded great results and I don't see that changing for our little corner of their market any time soon.
|
# ? Apr 12, 2014 20:23 |
|
My 290s don't overclock for poo poo. And heat isn't the issue.
|
# ? Apr 12, 2014 22:59 |
Don Lapre posted:Is 1228 a decent overclock on a 780? I got the same exact overclock on 3 660 Tis across two brands. I wonder if that's just the norm. I got 70 more MHz after changing the BIOS, and it was worthwhile even on that much slower card. Don't know if it's apples and oranges between that and a 780
|
|
# ? Apr 12, 2014 23:03 |
|
In theory, as time goes on, every 1mhz overclock is worth more than the previous gen.
|
# ? Apr 12, 2014 23:15 |
veedubfreak posted:In theory, as time goes on, every 1mhz overclock is worth more than the previous gen. True. If I had a single 780 I'd be doing it. I'm not sure if my MSIs would get any benefit from it since I'm not using all my TDP as it is before it gets unstable, where the ASUS reached 1228 at 114% TDP and was limited by power. Would be ironic if they both lost stability at 1228 but for completely different reasons.
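One way to compare "the same 1228" across different cards is in percentage terms over the reference boost clock. The reference clocks below are assumptions for illustration, not numbers quoted in this thread:

```python
def oc_gain(base_mhz, oc_mhz):
    """Relative clock gain over a reference/base clock."""
    return (oc_mhz - base_mhz) / base_mhz

# Assumed reference boost clocks: GTX 780 ~900 MHz, GTX 660 Ti ~980 MHz
print(f"GTX 780 @ 1228 MHz: +{oc_gain(900, 1228):.0%}")
print(f"GTX 660 Ti @ 1228 MHz: +{oc_gain(980, 1228):.0%}")
```

Same absolute ceiling, but a noticeably bigger relative gain on the 780, which is one way of reading the "worth more than the previous gen" point.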
|
|
# ? Apr 12, 2014 23:24 |
|
I need to put the PT1 BIOS on my 290s. That BIOS lets you push drat near 2v. OK, this 780: if anyone would like it for, say, $450 to cover PayPal fees and shipping, let me know. If not I'll just return it; it's not worth eBaying because I can't make enough off of it to care. Going to just put my other 2 290s in and hope that ESO and MW:O actually start supporting xfire.
|
# ? Apr 13, 2014 00:24 |
|
My 780 sits on 1228 on air as well.
|
# ? Apr 13, 2014 01:45 |
|
Dogen posted:My 780 sits on 1228 on air as well. MSI Lightning model, iirc?
|
# ? Apr 13, 2014 08:06 |
|
That was my 580, my 780 is an ASUS DCII
|
# ? Apr 13, 2014 14:53 |
|
That's right. Going senile
|
# ? Apr 13, 2014 16:08 |
|
veedubfreak posted:I need to put the pt1 bios on my 290s. That bios lets you push drat near 2v. ACX model, right? Was hoping it had one of the blower coolers on it.
|
# ? Apr 13, 2014 19:08 |
|
Even if the GTX 880 ends up being a 28nm card, couldn't it still be 20% faster than the 780 just from the architectural changes? Over a year between Kepler and Maxwell just seems way too long.
|
# ? Apr 13, 2014 20:19 |
|
The irksome part of the 880, if NV follows the same roadmap as Kepler, is that the 880 will be a mid-range card sold (and priced) as a high-end one.
|
# ? Apr 13, 2014 21:04 |
|
calusari posted:Even if the GTX 880 ends up being a 28nm card couldn't it still be 20% faster than the 780 just from the architectural changes? Over a year between kepler and maxwell just seems way too long. nVidia won't care if the GTX 700 series is out for over a year as long as AMD doesn't release anything new, which they won't. Some price drops to make the GTX 760 closer to 200 dollars and possibly releasing a real 760 Ti for the 250 dollar market makes more sense than trying to make performance Maxwell parts on 28nm. The GTX 780 is another prime candidate for a price drop, it's poor value compared to a R9 290 if you can get one at MSRP.
|
# ? Apr 13, 2014 21:09 |
|
Agreed posted:Alereon, I think you're missing nVidia's broader strategy moving forward.
|
# ? Apr 13, 2014 22:17 |
Jesus Christ guys, we've been getting ripped off. GTX 770s only cost $84.95 direct from China. http://www.aliexpress.com/store/1041810 Alarming how much positive feedback there is.
|
|
# ? Apr 14, 2014 01:36 |
|
Ignoarints posted:Jesus Christ guys, we've been getting ripped off. GTX 770's only cost $84.95 direct from China. The heatsinks on those things are hilarious.
|
# ? Apr 14, 2014 02:00 |
|
|
|
Ignoarints posted:Jesus Christ guys, we've been getting ripped off. GTX 770's only cost $84.95 direct from China. I kind of wonder what they are, actually. Just a low end GTX 620 that's been flashed to make it look like a GTX 770? I guess I shouldn't expect too much more from the country that makes counterfeit eggs.
|
# ? Apr 14, 2014 02:36 |