Boar It
Jul 29, 2011

Mesmerizing eyebrows is my specialty

Agreed posted:

This isn't the parts picking thread, and we JUST had a discussion of the best cards for given price points, but yeah it's the GTX 760. Performance does not hold up super well at >1200p resolution, but should you expect it to? It's a very solid card at a great price. You could also get its discontinued predecessor, the 660Ti, which is virtually the same card in performance terms but might be had on eBay or other outlets closer to $200, for an even better deal. Don't get gouged on it, though, you run that risk with cards that people bought thinking they might SLI down the line, one "risk" of that strategy. It's a bad strategy.

Edit: This is why we don't generally like parts picking type discussions in this thread, because there are a bunch of us who sort of mumble and look away when the topic of 780Ti being an insanely overpowered card for most applications comes up and it's stupid to buy one when the 780 and 290 exist (you know who you are, fuckers, and I love you for it). Medium settings? What is this, an orphanage? Jesus christ man turn it up to at least High, it ought to run well enough on High, just... turn some AA down or something.

Translation: you really should stick to the appropriate thread because you're going to get "video card enthusiast" style recommendations here, as you can see from mine, clashing against "save money, here's how" style responses, as you can see from the other responder, and really it's just a hot mess since we don't know what your resolution is, we don't know what kind of settings you're after, etc. - things that are explicitly requested before making a rec in the appropriate thread.

Yeah sorry about that. I noticed the parts picking thread after I made that post. I'll get over there. But thanks for the tips regardless.


Ignoarints
Nov 26, 2010

Agreed posted:

This isn't the parts picking thread, and we JUST had a discussion of the best cards for given price points, but yeah it's the GTX 760. Performance does not hold up super well at >1200p resolution, but should you expect it to? It's a very solid card at a great price. You could also get its discontinued predecessor, the 660Ti, which is virtually the same card in performance terms but might be had on eBay or other outlets closer to $200, for an even better deal. Don't get gouged on it, though, you run that risk with cards that people bought thinking they might SLI down the line, one "risk" of that strategy. It's a bad strategy.

Edit: This is why we don't generally like parts picking type discussions in this thread, because there are a bunch of us who sort of mumble and look away when the topic of 780Ti being an insanely overpowered card for most applications comes up and it's stupid to buy one when the 780 and 290 exist (you know who you are, fuckers, and I love you for it). Medium settings? What is this, an orphanage? Jesus christ man turn it up to at least High, it ought to run well enough on High, just... turn some AA down or something.

Translation: you really should stick to the appropriate thread because you're going to get "video card enthusiast" style recommendations here, as you can see from mine, clashing against "save money, here's how" style responses, as you can see from the other responder, and really it's just a hot mess since we don't know what your resolution is, we don't know what kind of settings you're after, etc. - things that are explicitly requested before making a rec in the appropriate thread.

I'm already spoiled by maxxxing out everything, including global settings that probably don't even do much. I'm ordering a 1440p monitor today and I'm going to be sad when I have to turn things down, especially since I'll probably be at 85 Hz or more (and fps is king for me). :smith: Time for 2x 770s.

veedubfreak
Apr 2, 2005

by Smythe
Hey Agreed, what speed were you getting on your normal 780 before you sold it? I'm going to pop the one I got from microcenter in over the weekend and see how far it goes on air and decide if I should get a waterblock for it or not. My 290s don't overclock for poo poo and since I'm going back to single card, just wondering if an overclocked 780 or 290 is the one to keep. I tried 1100 on my core last night and started getting artifacts and driver crashes after just 2-3 games. Temps aren't my issue, the 290 just doesn't overclock for poo poo without giving it tons of voltage.

Ignoarints
Nov 26, 2010

veedubfreak posted:

Hey Agreed, what speed were you getting on your normal 780 before you sold it? I'm going to pop the one I got from microcenter in over the weekend and see how far it goes on air and decide if I should get a waterblock for it or not. My 290s don't overclock for poo poo and since I'm going back to single card, just wondering if an overclocked 780 or 290 is the one to keep. I tried 1100 on my core last night and started getting artifacts and driver crashes after just 2-3 games. Temps aren't my issue, the 290 just doesn't overclock for poo poo without giving it tons of voltage.

(sell the 290s for a 780 Ti?)

SlayVus
Jul 10, 2009
Grimey Drawer

veedubfreak posted:

Hey Agreed, what speed were you getting on your normal 780 before you sold it? I'm going to pop the one I got from microcenter in over the weekend and see how far it goes on air and decide if I should get a waterblock for it or not. My 290s don't overclock for poo poo and since I'm going back to single card, just wondering if an overclocked 780 or 290 is the one to keep. I tried 1100 on my core last night and started getting artifacts and driver crashes after just 2-3 games. Temps aren't my issue, the 290 just doesn't overclock for poo poo without giving it tons of voltage.

Why not wait for the 6GB versions? Don't you have a high res setup?

ShaneB
Oct 22, 2002


Agreed posted:

Translation: you really should stick to the appropriate thread because you're going to get "video card enthusiast" style recommendations here, as you can see from mine, clashing against "save money, here's how" style responses, as you can see from the other responder, and really it's just a hot mess since we don't know what your resolution is, we don't know what kind of settings you're after, etc. - things that are explicitly requested before making a rec in the appropriate thread.

Honestly anything lower than a GTX760 or R9 270x just doesn't seem like a gaming card at all. For $200-220 that seems perfectly appropriate. I would either be using onboard or drop the $200 on a good card - not much sense going in the middle IMO.

veedubfreak
Apr 2, 2005

by Smythe

SlayVus posted:

Why not wait for the 6GB versions? Don't you have a high res setup?

The only time memory seems to become an issue is in multi-card setups. Since none of the games I play support multiple cards, I'm trying to decide which card to keep until the 20nm cards come out.

Basically I'm just trying to minimize my costs by keeping a single card. Just haven't decided which. I can return the 780 within 30 days so that card is kind of a nonfactor if I decide not to keep it.

Ignoarints
Nov 26, 2010

SlayVus posted:

Why not wait for the 6GB versions? Don't you have a high res setup?


Honestly man, the only difference between 2 and 4 GB of VRAM that I've seen (even on single cards) in game benchmarks is when you go three monitors wide. I'm becoming more and more convinced that a lot of VRAM (right now) isn't worth the money.

Just one example http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/

It's even less of a difference in SLI actually, but I haven't come across much testing on that.
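For anyone curious why triple-wide is where VRAM starts to matter, here is a rough sketch of the render-target portion alone. The bytes-per-pixel and buffer counts are illustrative assumptions, and texture budgets (which usually dominate real VRAM use) are ignored; only the part that scales directly with resolution is shown.

```python
# Rough sketch: render-target memory at various setups, assuming
# 4 bytes/pixel color + 4 bytes/pixel depth/stencil and triple
# buffering. Textures usually dominate real VRAM use; this is just
# the portion that scales directly with resolution.

def render_target_mb(width, height, bytes_per_pixel=8, buffers=3):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for name, w, h in [("1080p", 1920, 1080),
                   ("1440p", 2560, 1440),
                   ("3x1440p surround", 7680, 1440)]:
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB")
```

The jump from one 1440p panel to three is a straight 3x on this slice of the budget, which is consistent with surround setups being where the 2 GB vs 4 GB gap shows up in benchmarks.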

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Ignoarints posted:

Honestly man, the only differences between 2 or 4 gb vram that I've seen (even in single cards) in game benchmarks is when you go 3 monitors wide. I'm becoming more and more convinced that a lot of vram (right now) isn't worth the money.

Which is what veedub runs. (3x1440p)

Ignoarints
Nov 26, 2010

deimos posted:

Which is what veedub runs. (3x1440p)

Oh, I just figured because it was one video card... well drat, heh. Well, for anyone else who wants to know, I guess.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Agreed posted:

I'd be interested in hearing your reasoning behind this, especially considering that nVidia is absolutely certain at this point that they'll be fighting Intel in the HPC market at the same time and has a certain degree of inertia that forces their hand there (though it was a very clever move, in retrospect, to go from big hot-clocked SMs to SMXes with greater parallelization... though SMXes are a tad batshit in a "well it works fine, stop griping" kind of way). Not to mention it's not in keeping with, well, anything, frankly, that we've seen from either company. But before I say anything further I would prefer, if you don't mind, for you to note your thinking on the matter so we're not talking past each other.
Sure. The things that make a GPU great for high performance compute make it extremely inefficient for gaming, in terms of both die area (and thus manufacturing and user cost) and power usage. The largest factor is that providing FP64 compute capabilities that are disabled on gaming cards costs a huge number of transistors that could either not be spent (lowering costs and raising clock headroom), or could be spent on things that do improve gaming performance.

The GK104 GPU that powered the GTX 680/770 is a gaming GPU. On the GTX 770 it offers 3.2 TFLOPs of single-precision compute performance, 33.5 GigaPixels/sec of ROP performance, and 134 GigaTexels/sec of texturing performance, at a cost of 3.54 Billion transistors, 294 mm^2 of die area, and 230 watts, with a launch price of $499 on the GTX 680 and $399 on the GTX 770.

The GK110 GPU that powered the GTX 780 is an HPC GPU. On the GTX 780 it offers 3.977 TFLOPs FP32, 41.4 GP/sec, and 160 GT/sec, at a cost of 7 Billion transistors, 561 mm^2 of die area, and 250W, with a launch price of $650.

For nearly twice the cost to nVidia, GK110 does not deliver anywhere near twice the performance of GK104 for gaming applications. Granted the GTX 780 was very heavily die harvested, but that is necessary to allow the GPU to reach yield and power targets and a hypothetical GM100 would face the same neutering for consumer cards. A major point of the Maxwell architecture is efficiency, both power and die-size, so I think a Maxwell-based GTX 880 would look more like a larger GM104 without the transistors dedicated to FP64 and the pins/board area for a memory bus wider than 256-bit. They can use two of those in a GTX 890 for applications where a single GTX 880 isn't enough, and we'll eventually see die-harvested versions of whatever the successor to GK110 is for Teslas.
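Plugging the quoted numbers into a quick script makes the per-transistor and per-dollar gap concrete. The figures are the ones stated in the post above, not independently verified:

```python
# Back-of-the-envelope on the numbers above: gaming throughput per
# transistor, per watt, and per dollar for GK104 (GTX 770) vs
# GK110 (GTX 780), using the figures quoted in the post.

cards = {
    "GTX 770 (GK104)": dict(tflops=3.2,   transistors_b=3.54, watts=230, price=399),
    "GTX 780 (GK110)": dict(tflops=3.977, transistors_b=7.0,  watts=250, price=650),
}

for name, c in cards.items():
    print(f"{name}: "
          f"{c['tflops'] / c['transistors_b']:.2f} TFLOPs per B transistors, "
          f"{c['tflops'] * 1000 / c['watts']:.1f} GFLOPs/W, "
          f"{c['tflops'] * 1000 / c['price']:.1f} GFLOPs/$")
```

On these numbers GK104 comes out roughly 60% ahead per transistor and ahead per dollar as well, which is the core of the argument; per watt the two are close, which matches the die-harvesting point below.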

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Alereon posted:

... capabilities that are disabled on gaming cards costs a huge number of transistors that could either not be spent (lowering costs and raising clock headroom), or could be spent on things that do improve gaming performance.
just wanted to point out one thing here: GK110 is 20W higher than GK104 despite having 50% more memory and 2x the transistors. I actually think the extra gig of GDDR5 makes up the bulk of that disparity, but the thing to keep in mind is that the vast majority of those extra transistors in GK110 are just *off*. the FP64 stuff doesn't even get powered on, which means there's no static leakage (which is a huge chunk of your power costs at 28nm). it's not like they're giving up 3.5B transistors that could be powered and actively doing something useful for an app.

edit: also read this http://www.highperformancegraphics.org/previous/www_2012/media/Hot3D/HPG2012_Hot3D_NVIDIA.pdf

Professor Science fucked around with this message at 23:57 on Apr 10, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Professor Science posted:

just wanted to point out one thing here: GK110 is 20W higher than GK104 despite having 50% more memory and 2x the transistors. I actually think the extra gig of GDDR5 makes up the bulk of that disparity, but the thing to keep in mind is that the vast majority of those extra transistors in GK110 are just *off*. the FP64 stuff doesn't even get powered on, which means there's no static leakage (which is a huge chunk of your power costs at 28nm). it's not like they're giving up 3.5B transistors that could be powered and actively doing something useful for an app.

edit: also read this http://www.highperformancegraphics.org/previous/www_2012/media/Hot3D/HPG2012_Hot3D_NVIDIA.pdf
You're right that the TDP difference is not as significant between GK104 and GK110 as implemented on the GTX 780, but that's primarily because so much of the GTX 780 GPU is disabled. There's a 65W increase from the GTX 770 to the GTX 780 Ti running in 1/24 FP64, and the 780 Ti is actually clocked lower than the GTX 770 and almost identically to the GTX 780. Also, are you sure that the transistors providing FP64 are unpowered on 1/24 FP64 GK110? It's not like there are FP64 SMXs they can completely power-gate, all the SMXs are capable of 1/3 FP64, so while I agree you're obviously not paying for those transistors to be switching I don't think you're getting zero leakage from them. I'll admit I don't know enough about the low-level details of how the chip works to be confident here, though.

Power really is the least of my concerns, though. Die area means yield, cost, and volume penalties. I think gamers would rather spend that area and those transistors on more FP32 SMXs and fixed-function hardware for graphics, and that nVidia knows they get the most cost-efficient product by going that route.

Ignoarints
Nov 26, 2010
Sold my monitor quicker than I thought, won't have one until a QNIX comes in :smith: At least I can see what I can manage at 1440p with 2 GB cards. Spent the time finally repasting my 660 Tis and doing a little bit of cable management at least



spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
drat it I would be tempted to get a second ASUS 660Ti used and try SLI but then I remembered that my motherboard only supports Crossfire. :negative:

But is crossfire still a wash on AMD cards? I ask because I have two Sapphire HD7850s: one that was in my main rig before I upgraded to a 660Ti, and another that's still in my HDTV rig, though I hardly game on that rig anymore. Or would I get microstuttering in half of my games from frame pacing issues again, like I was getting with a single HD7850?

Ignoarints
Nov 26, 2010

spasticColon posted:

drat it I would be tempted to get a second ASUS 660Ti used and try SLI but then I remembered that my motherboard only supports Crossfire. :negative:

But is crossfire still a wash on AMD cards? I ask because I do have two Sapphire HD7850 cards with one that I had in my main rig before upgrading to a 660Ti the other which is still in my HDTV rig but I hardly game on my HDTV rig anymore. Or would I get microstuttering in half of my games from frame pacing issues again like I was getting with a single HD7850?

I got a LITTLE stuttering at max settings in BF4 only. I'd hit anywhere from 90-180 (lol I know) fps but would dip well under 60 occasionally. Once I frame-limited to my monitor's refresh rate it's basically smooth as silk.

It seems to go either way with Crossfire depending on the situation, but generally it's not as good. I had Crossfire 7870s for like two weeks a while ago and it was really bad.

If you want to SLI sell and swap your mobo :)

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

spasticColon posted:

drat it I would be tempted to get a second ASUS 660Ti used and try SLI but then I remembered that my motherboard only supports Crossfire. :negative:

If you want to enter the realm of janky poo poo, there have been hacks for years to enable SLI on non-SLI boards. I've never done it myself.

http://www.overclockers.com/forums/showthread.php?t=638013

Ignoarints
Nov 26, 2010

Zero VGS posted:

If you want to enter the realm of janky poo poo, there's been hacks for years to enable SLI on non-sli boards. I've never done it myself.

http://www.overclockers.com/forums/showthread.php?t=638013

Totally support janky poo poo

(although it might not be worth buying a video card you can't return without knowing it worked first somehow)

Ignoarints fucked around with this message at 15:13 on Apr 11, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

You're right that the TDP difference is not as significant between GK104 and GK110 as implemented on the GTX 780, but that's primarily because so much of the GTX 780 GPU is disabled. There's a 65W increase from the GTX 770 to the GTX 780 Ti running in 1/24 FP64, and the 780 Ti is actually clocked lower than the GTX 770 and almost identically to the GTX 780. Also, are you sure that the transistors providing FP64 are unpowered on 1/24 FP64 GK110? It's not like there are FP64 SMXs they can completely power-gate, all the SMXs are capable of 1/3 FP64, so while I agree you're obviously not paying for those transistors to be switching I don't think you're getting zero leakage from them. I'll admit I don't know enough about the low-level details of how the chip works to be confident here, though.

Power really is the least of my concerns, though. Die area means yield, cost, and volume penalties. I think gamers would rather spend that area and those transistors on more FP32 SMXs and fixed-function hardware for graphics, and that nVidia knows they get the most cost-efficient product by going that route.

Without getting into teeny tiny transistors and the 3D lithography involved in making 'em tick, I feel confident stating that your concerns as put forward are addressed adequately by

1. The strategy presented in the slides that Prof. Science linked, which favors performance and efficiency over reduction in die area (an issue that has notably affected AMD, especially when looking at the TDP figures for our little "preview" Maxwell cards - they seem with what I'm sort of stupidly calling "speculative certainty" to have initially planned their current crop around a process node shrink, and their transistor density and heat output is adequately explained by the failure of TSMC to meet that demand).

2. The practical reality of GK110 - it works fine, what's the issue? It's big? Dies are pretty big these days. Supply is not a problem. If this were going to be a problem, one would expect it to have shown itself by now.*

3. HPC first, gamers a long rear end time later. nVidia knows Pascal is going to be positioned against Intel. They'll have a card to meet it, or they'll fail. That card may not debut in consumer space, but rather follow the same "pure HPC --> workstation --> consumer" path that's been so successful for GK110, but I think referring to the needs of gamers as being a serious consideration is less ... poignant, at this particular juncture, than perhaps it could be. Right now it's tooth and nail in HPC and if Intel loses with Knight's Landing, it probably won't be by much and they're still in the game; it'd be a more devastating loss for nVidia. Games have adapted to what these little supercomputers we call graphics cards are good at for good reasons. It's symbiotic, serendipitous, and also totally necessary. So that works out great for everyone.



*What defines GK110 as a "desperation move," as you referred to it? That's my strongest point of total, I-don't-understand-what-your-words-mean disagreement.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
If you want to be a profitable GPU company, your goal is to produce GPUs that are as small as reasonably possible for a given market segment. This allows you to produce more of them per-wafer as well as reducing the impact of defects, lowering your per-GPU cost. Remember that GK110 is a year old now; it absolutely was dogged with yield and availability issues due to its size; even the halo part was die-harvested! The fact that you see general availability now is because TSMC's 28nm process is as mature as you can get and yields are incredibly high considering die size. I described the GK110 as a desperation move when sold in gaming cards because you're basically wasting almost half the transistors on the GTX 780; it was sold as a high-end gaming card because nVidia had to get rid of cards, they didn't have a 20nm successor to GTX 680, and AMD was brute-forcing performance with huge HPC GPUs that were effectively pre-overclocked.
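The yield arithmetic here can be sketched with a toy Poisson defect model. The defect density and the edge-loss-free die count are assumptions for illustration, not TSMC figures; only the die areas come from the thread:

```python
# Toy Poisson yield model to illustrate why die area hurts twice:
# fewer candidate dies per wafer AND a lower fraction come out clean.
import math

WAFER_AREA_MM2 = math.pi * 150 ** 2     # 300 mm wafer, ignoring edge loss
D0 = 0.1                                # defects per cm^2 (assumed)

def good_dies(die_area_mm2):
    gross = WAFER_AREA_MM2 / die_area_mm2             # candidate dies
    yield_frac = math.exp(-D0 * die_area_mm2 / 100)   # Poisson yield
    return gross * yield_frac

gk104 = good_dies(294)   # GK104 die area from the thread
gk110 = good_dies(561)   # GK110 die area from the thread
print(f"GK104 (294 mm^2): ~{gk104:.0f} good dies/wafer")
print(f"GK110 (561 mm^2): ~{gk110:.0f} good dies/wafer")
print(f"cost-per-good-die ratio: ~{gk104 / gk110:.1f}x")
```

Under these assumed numbers the big die costs well over twice as much per good chip, which is the "nearly twice the cost to nVidia" point from earlier in the thread.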

I'm certainly not saying that nVidia won't make a successor to GK110, I'm saying that GPU won't be the GTX 880, in the same way GK110 wasn't the GTX 680. nVidia doesn't want to sell gamers an HPC card with most of its capabilities wasted, they want to sell them a gamer card that costs half as much to build and delivers most of the gaming performance, freeing them to sell HPC GPUs to a market that will pay more for them.

PC LOAD LETTER
May 23, 2005
WTF?!

spasticColon posted:

But is crossfire still a wash on AMD cards?
Frame pacing has helped a lot for the 7xxx cards, but the situation still isn't all that great unfortunately, and there are still caveats IIRC (i.e. keep resolution at or below 1600p, no windowed games, etc.). For the R9 29x cards CF seems to be pretty solid, and if not as good as nV's SLI it's at least nearly as good. Guru3D does some good benches comparing the two multi-card solutions, and for a while now their results boil down to 'yeah they're fast, no we can't see a difference between the two, here are some charts, bla bla'.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Is 1228 a decent overclock on a 780?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Don Lapre posted:

Is 1228 a decent overclock on a 780?

Very very good, actually. Top notch on air, you'd have to go into the BIOS and allow more voltage to get a higher one in most cases. I had a pretty steady 1170s OC on mine and that was a good OC at the time, they're generally high leakage and getting a really high OC on air is impressive so long as you're sure it's stable. On the various overclocking sites, that'd put you among the "elite" of non-modded 780 overclockers.

So, yeah, solid!

Alereon, I think you're missing nVidia's broader strategy moving forward. That said, I'm as interested as anybody to see what the specifics are on their next highest end card for consumers - I wouldn't be at all surprised to see something similar to Kepler's launch "style" or pattern, introducing a powerful-enough-to-compete smaller card first and then when stock permits moving to the bigger card. I just disagree that it was in any way a desperate move on their part; rather, I think it was a calculated decision and paid off handsomely for them. And it is repeatable, strategically speaking. I may be overestimating TSMC's lithographic prowess once they finally get the node shrink up and running; I can't say for certain. But I think that nVidia's very capable modular approach has yielded great results and I don't see that changing for our little corner of their market any time soon.

veedubfreak
Apr 2, 2005

by Smythe
My 290s don't overclock for poo poo :( And heat isn't the issue.

Ignoarints
Nov 26, 2010

Don Lapre posted:

Is 1228 a decent overclock on a 780?

I got the exact same overclock on three 660 Tis across two brands. I wonder if that's just the norm.

I got 70 more MHz after changing the BIOS, and it was worthwhile even on that much slower card. Don't know if it's apples and oranges between that and a 780.

veedubfreak
Apr 2, 2005

by Smythe
In theory, as time goes on, every 1 MHz of overclock is worth more than in the previous gen.
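That point can be sketched with a toy throughput model. The shader counts are the real CUDA-core counts for the 660 Ti (1344) and 780 (2304), but treating throughput as cores times clock is a deliberate oversimplification:

```python
# Toy model of why the same +1 MHz is worth more on a wider chip:
# FP32 throughput ~ cores * clock * 2 (one FMA = 2 FLOPs per clock).

def gflops_per_mhz(cores, flops_per_core_per_clock=2):
    # GFLOPs gained per extra MHz of core clock
    return cores * flops_per_core_per_clock / 1000

print(f"660 Ti (1344 cores): +{gflops_per_mhz(1344):.2f} GFLOPs per extra MHz")
print(f"780    (2304 cores): +{gflops_per_mhz(2304):.2f} GFLOPs per extra MHz")
```

Real scaling is messier (memory bandwidth, boost behavior, game bottlenecks), but the direction matches: identical MHz gains buy more absolute performance each generation.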

Ignoarints
Nov 26, 2010

veedubfreak posted:

In theory, as time goes on, every 1 MHz of overclock is worth more than in the previous gen.

True. If I had a single 780 I'd be doing it.

I'm not sure if my MSIs would get any benefit from it, since I'm not using all my TDP as it is before it gets unstable, whereas the ASUS reached 1228 at 114% TDP and was limited by power. It would be ironic if they both lost stability at 1228 but for completely different reasons.

veedubfreak
Apr 2, 2005

by Smythe
I need to put the PT1 BIOS on my 290s. That BIOS lets you push drat near 2V.

Ok, this 780: if anyone would like it for, say, $450 to cover PayPal fees and shipping, let me know. If not I'll just return it; it's not worth eBaying because I can't make enough off of it to care.

Going to just put my other 2 290s in and hope that ESO and MW:O actually start supporting xfire.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
My 780 sits on 1228 on air as well.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

My 780 sits on 1228 on air as well.

MSI Lightning model, iirc?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
That was my 580, my 780 is an ASUS DCII

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That's right. Going senile :cry:

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



veedubfreak posted:

I need to put the pt1 bios on my 290s. That bios lets you push drat near 2v.

Ok this 780. If anyone would like it for I'd say 450 to cover paypal fees and shipping let me know. If not I'll just return it, it's not worth ebaying because I can't make enough off of it to care.

Going to just put my other 2 290s in and hope that ESO and MW:O actually start supporting xfire.

ACX model, right? Was hoping it had one of the blower coolers on it.

calusari
Apr 18, 2013

It's mechanical. Seems to come at regular intervals.
Even if the GTX 880 ends up being a 28nm card, couldn't it still be 20% faster than the 780 just from the architectural changes? Over a year between Kepler and Maxwell just seems way too long.

Wistful of Dollars
Aug 25, 2009

The irksome part of the 880, if NV follows the same roadmap as Kepler, is that the 880 will be a mid-range card sold (and priced) as a high-end one.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

calusari posted:

Even if the GTX 880 ends up being a 28nm card couldn't it still be 20% faster than the 780 just from the architectural changes? Over a year between kepler and maxwell just seems way too long.

nVidia won't care if the GTX 700 series is out for over a year as long as AMD doesn't release anything new, which they won't. Some price drops to make the GTX 760 closer to 200 dollars and possibly releasing a real 760 Ti for the 250 dollar market makes more sense than trying to make performance Maxwell parts on 28nm. The GTX 780 is another prime candidate for a price drop, it's poor value compared to a R9 290 if you can get one at MSRP.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Agreed posted:

Alereon, I think you're missing nVidia's broader strategy moving forward.
I think we can agree to disagree on this, but I'm curious what you mean here about how their strategy differs from what I posted.

Ignoarints
Nov 26, 2010
Jesus Christ guys, we've been getting ripped off. GTX 770s only cost $84.95 direct from China.

http://www.aliexpress.com/store/1041810

:v:

alarming how much positive feedback

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Ignoarints posted:

Jesus Christ guys, we've been getting ripped off. GTX 770's only cost $84.95 direct from China.

http://www.aliexpress.com/store/1041810

:v:

alarming how much positive feedback

The heatsinks on those things are hilarious.


forbidden dialectics
Jul 26, 2005





Ignoarints posted:

Jesus Christ guys, we've been getting ripped off. GTX 770's only cost $84.95 direct from China.

http://www.aliexpress.com/store/1041810

:v:

alarming how much positive feedback

I kind of wonder what they are, actually. Just a low-end GTX 620 that's been flashed to make it look like a GTX 770? I guess I shouldn't expect too much more from the country that makes counterfeit eggs.
