Josh Lyman
May 24, 2009


If anyone wants to buy my 560 Ti for SLI or whatever, I've listed it on SA Mart: http://forums.somethingawful.com/showthread.php?threadid=3587289


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

PC LOAD LETTER posted:

Probably because it's only ~50W more than the GTX 780; the people/market (i.e. gamers, these aren't FireStream Hawaii cards) interested in these cards care more about performance bang vs. buck, and TSMC/GF/<insert non-Intel foundry of choice> don't have much in the way of process improvements to offer now or anytime soon. Slap a better cooler on a 290 or 290X and it drops to 20-30W more power vs. a GTX 780. The power issue is overblown and mostly FUD at this point. If they did their job properly designing the card, even the temp issue is overblown FUD. Noise can definitely be a problem for some, though. They definitely should've done a better job on the reference cooler any way you look at it.

The rumor mill is still saying mid-to-late 2014 for TSMC's 20nm chips to roll off the line, and that the improvements over their 28nm process won't be all that impressive, at least initially. They've improved their 28nm process over time, and I'm sure they'll do the same with their 20nm tech. Thing is, I don't see them doing much of it in a timely manner before their next process is supposed to be ready... unless they're going to delay that too.

That would make a 28nm Maxwell reasonable for nvidia to do, but it'd probably also be a relatively hot and power-hungry chip vs. Kepler on that process.

The HPC Hawaii cards are going to have ungimped double-precision compute performance, which is something GCN is pretty good at. The performance/watt probably won't be "poor" at all vs. Kepler for those workloads. Power usage hasn't been the issue with AMD getting the HPC guys to use their hardware anyway; it's software and developer support.

I don't really disagree with anything you said - my post was mainly aimed at getting to the same conclusion, that AMD is doubling down on gamers because that's where they can make money, and gamers demonstrably care way less about that poo poo than HPC (where they're blocked anyway by much richer and better connected competitors). I think we might be at risk of talking past each other a bit.

I am interested in what makes you think that Maxwell will necessarily be hotter running than Kepler if it's launched on the 28nm process. Considering GK110 comes in at 7.1 billion transistors, and their lowest-leakage chips can be run alongside 12GB of 4Gb GDDR5 modules in a 225W TDP, we're getting the leftovers even at the highest levels as gamers. nVidia is very good at efficiency already - why would explicitly engineering toward more efficiency, and integrating a CPU on the card itself to operate more effectively in terms of overall system resource utilization, make it hot at 28nm but not at 20nm? Aren't we pretty much allowing that a node shrink alone, at this point, doesn't offer the sort of really impressive efficiency boosts that it used to, especially one that isn't introducing anything especially radical?

If so, then doesn't it follow that the first production proven 20nm planar TSMC cards probably aren't going to be some kind of efficiency magic bullet? I'm not expecting the kinds of dramatic results we used to get back in the big old days (... so to speak) until we go from 20nm planar to the weird 16nm FinFET they're cooking up. Even then, it's still in a lot of ways sort of effectively 20nm, at least according to the numbers IBM released. Node shrinks ain't what they used to be, I tells ya. Should be great for efficiency compared to planar but I don't expect a huge performance increase. We've kinda already been to that rodeo with Intel's lithography from Sandy Bridge on. If IBM can barely budge the contact poly pitch and can't move the metal pitch a single nm moving from 20nm planar to 14nm FinFET, I don't see how TSMC is going to work any kind of miracles with a less aggressive, quicker shrink.

On that note, by the way, I fully agree with your view re: TSMC's 20nm shrink and the :raise: timing of even getting it out the door, let alone improvements on it before 16nm FinFET. Frankly, I wouldn't be surprised if they join the Intel club and just float it at 20nm for a bit, working on that while they work out the bugs in their next shrink. They're not the only game in town, but they're one of the few, and I don't think they're going to lose any GPU business from production issues if history is any indicator, aheh.

Agreed fucked around with this message at 08:12 on Nov 21, 2013

PC LOAD LETTER
May 23, 2005
WTF?!

Agreed posted:

I think we might be at risk of talking past each other a bit....I am interested in what makes you think that Maxwell will necessarily be hotter running than Kepler if it's launched on the 28nm process....why would explicitly engineering toward more efficiency and integrating a CPU on the card itself to operate more effectively in terms of overall system resource utilization make it hot at 28nm but not at 20nm?...Aren't we pretty much allowing that a node shrink alone, at this point, doesn't offer the sort of really impressive efficiency boosts that it used to, especially one that isn't introducing anything especially radical?
Most of what I was responding to was your comments about Hawaii having "poor" performance per watt vs. Kepler, which I'd disagree with. It's definitely not quite as good, but it isn't poor either.

Unless nvidia has managed some significant breakthrough(s) in transistor + uarch design (unlikely; I think the low-hanging fruit is pretty much gone, which is why you're seeing AMD/nvidia push stuff like Mantle, TrueAudio, memory virtualization, ARM cores, etc.) they're going to have to use lots more transistors to get close to the typical performance increases (i.e. 30%+) that people have come to expect from a new GPU. Lots more transistors on the same process with similar clocks = more heat/power usage. Heck, even if they just use a similar number of transistors but bump the clocks quite a bit and go for a "speed demon" design, power usage will shoot up. If it turns out Maxwell is just current Kepler + ARM CPU + memory virtualization, then I'd be wrong about the power shooting up by quite a bit, but you're also not going to see a large performance increase either.
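That scaling logic can be roughed out with the classic dynamic-power relation (power scales roughly with transistor count, voltage squared, and clock). The ratios below are made-up illustrations, not real GPU figures:

```python
# Rough dynamic-power scaling: P ~ transistor_count * V^2 * f.
# All ratios here are illustrative assumptions, not measured GPU data.

def relative_power(transistor_ratio: float, freq_ratio: float,
                   volt_ratio: float = 1.0) -> float:
    """Power relative to a baseline chip (baseline = 1.0)."""
    return transistor_ratio * volt_ratio ** 2 * freq_ratio

# "Lots more transistors, same process, similar clocks":
# 40% more transistors at the same clock/voltage -> ~40% more power.
print(relative_power(1.4, 1.0))         # 1.4

# "Speed demon": same transistor count, 25% higher clock, and assume
# voltage has to rise ~10% to get there -> power up ~51%.
print(relative_power(1.0, 1.25, 1.10))  # ~1.51
```

Either way the power budget climbs fast on the same node, which is the crux of the argument above.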

I don't think, given the rumors, that TSMC's 20nm will be anything special either. But I'd be surprised if that sort of die shrink didn't also knock power usage down to somewhere closer to where Kepler is right now which most people seem to consider "normal" for a high end GPU. I'd be assuming of course that nvidia would be aiming to leave transistors/clocks the same and just do a "simple" shrink. They may not, in which case 20nm Maxwell may still end up having higher power usage and end up "hot" but with more performance.

It might not be a "bad" trade off and even if it was nvidia might do it anyways so long as the GPU/card price is right and the card isn't too noisy. I'm sure they've been watching with interest what AMD has been able to pull off and sell with Hawaii.

Barfolemew
Dec 5, 2011

Non Serviam
I just can't decide what GPU to buy. I will play a lot of Battlefield 4, Assassin's Creed IV: Black Flag, and Crysis 3, plus the newest GTA and Watch Dogs when they become available. I don't really play older games that much; the "oldest" ones I currently play are Civ 5, XCOM, War Thunder, etc.

I'm trying to decide between the 780 Ti, R9 290X, and HD 7990. Prices for the 7990 are way down, so I can get one from Germany for a little over 500 euros. I'm really tempted by the HD 7990, but how are the driver issues? I've read that it still has some problems.

I don't really care about power consumption or noise level; I'm used to noisy fans anyway and usually play with headphones. Money-wise I can spend between 500-800 euros.

Battlefield 4 multiplayer and AC IV interest me the most right now. 1440p resolution.



My specs are:
CPU: i5-2500K @ 4.5GHz, Noctua NH-D14
MB: ASRock P67 Pro3 (can't SLI)
RAM: Vulcan Series Red dual-channel kit, DDR3 2133MHz (PC3-17000) CL11
SSD: Samsung 840 EVO 250GB
HDD: WD Black 2TB
Monitor: QNIX QX2710 2560x1440
GPU: ?

Thanks for any input.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

woppy71 posted:

Guys, I have £65 to spend on a new graphics card for my PC (I'm a gamer on a low budget) to replace the GT430 I currently have installed.

My monitor has a maximum resolution of 1360x768 and, to be honest, I'm not too worried about playing games at "ultra" settings; I'm more than willing to dial down the settings to get playable frame rates.

My current system has an Intel Core 2 Duo processor (not sure of the exact model, but it's the 2500MHz version) and 6GB of RAM.

I've read a few reviews of budget graphics cards and one particular model gets mentioned quite a lot: the Radeon HD 7750.

Do you think this would be a reasonable upgrade, bearing in mind my expectations, or are there better cards out there for around £65?

Your input would be greatly appreciated :)

You can get a GeForce 640 or a Radeon 7750 at 65 quid, but the 640 is pretty bollocks, beaten easily by the 7750.

Spend just under 75 and you can get a GDDR5 7750 with a passive cooler!

At 75, as I suggested above, you could also go for a GeForce 650, which is basically exactly the same as the 7750 (although given the choice, I'd personally take the fanless 7750 for the silence).

HalloKitty fucked around with this message at 12:40 on Nov 21, 2013

lethial
Apr 29, 2009

PC LOAD LETTER posted:

The power issue is overblown and mostly FUD at this point.

Not all gamers, though. If you're trying to build a compact gaming PC, excess power and heat is a big issue, and aftermarket coolers are pretty much out of the question in compact cases.

I have a Falcon NW Tiki case, so I really hope "next gen" gfx cards will use less power while still providing improved performance.

LASER BEAM DREAM
Nov 3, 2005

Oh, what? So now I suppose you're just going to sit there and pout?
I'm looking at picking up a new Nvidia GPU in the ~$350-400 range. Is the GTX 770 a good choice? Is there anything else I should be looking at?

Edit: This will be backed by an i5-2500K, if that matters.

ClassH
Mar 18, 2008

LASER BEAM DREAM posted:

I'm looking at picking up a new Nvidia GPU in the ~$350-400 range. Is the GTX 770 a good choice? Is there anything else I should be looking at?

Edit: This will be backed by an i5-2500K, if that matters.

If you want to stick to the bottom of that range, I grabbed this one for $329:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121770

I wanted nvidia, and 2 of the 3 games that came with it were pretty important to me. The newest BIOS for that card ups the clocks a little, so keep that in mind when you look at reviews.

knox_harrington
Feb 18, 2011

Running no point.

Haeleus posted:

So I want to splurge on a new GPU but the 780ti is a bit too expensive; I'm willing to buy something in the $500s-600 range. I'd like to know how a 780 superclocked ACX would compare in performance to a 290x.

Also, I haven't used an ATI card since the Radeon 9600 back in the day. Besides the heat issue (which is why if I do get a 290x I'll wait for a non-ref design cooler), is there any caveat to going AMD over Nvidia for 1080p gaming like inconsistent performance, driver issues, etc.?

A 780 or 290x is really overkill if you're going to stick at 1080p.

PC LOAD LETTER
May 23, 2005
WTF?!

lethial posted:

If you are trying to build a compact gaming PC, excess power and heat is a big issue.
Sure, but that will be true of any high-end GPU right now. They all use 200W+ of power and have dual-slot coolers. That situation probably won't change with anything top-end from nvidia or AMD for years.

thebushcommander
Apr 16, 2004
HAY
GUYS
MAKE
ME A
FUNNY,
I'M TOO
STUPID
TO DO
IT BY
MYSELF
Finally got my new 290X up and running in my PC. Had to get a new monitor and modify the case so the card would fit, but it's in and works. The card runs everything pretty drat awesome, even on my first-gen i7-890 @ 3.4GHz w/ 16GB RAM, which is still good I guess. I had to override fan controls with Afterburner, though, because I don't think even the newest beta drivers are working correctly. Set up a fan speed curve from 20% at idle to 57% under load (BF4 maxed out); it keeps the card around 88°C. It's fairly loud at 57%, but not leaf-blower loud. 100% was insane in testing, though.

The real point is that after getting it all running I realized I don't really care much for PC gaming anymore, or at least not enough to need a 290X.

Agrajag
Jan 21, 2006

gat dang thats hot

thebushcommander posted:

Finally got my new 290X up and running in my PC. Had to get a new monitor and modify the case so the card would fit, but it's in and works.

Is your case smaller than normal or is that card just extraordinarily big?

thebushcommander
Apr 16, 2004
HAY
GUYS
MAKE
ME A
FUNNY,
I'M TOO
STUPID
TO DO
IT BY
MYSELF

Agrajag posted:

Is your case smaller than normal or is that card just extraordinarily big?

I have a Gigabyte 3D Aurora; I've had it for many years. It's a standard ATX case, but the available space was kind of cramped even before this: my motherboard barely fit, with drives plugged into the SATA ports, thanks to an internal HDD cage that wasn't removable. I needed about 2 more inches of room for the 290X to fit, so I ended up cutting the HDD cage out and mounting my drives in the 3.5" drive bays above. The 290X is a little over 11" long; with the cage removed I freed up an additional ~6" or so, which also allowed for better cable management.

That said, before this I probably wouldn't have been able to fit anything bigger than a 400-series Nvidia card. I had a GTX 460 in before, and it had maybe 1/2" of clearance at the end.

thebushcommander fucked around with this message at 17:06 on Nov 21, 2013

computer parts
Nov 18, 2010

PLEASE CLAP

knox_harrington posted:

A 780 or 290x is really overkill if you're going to stick at 1080p.

On a related note, what would be the most cost efficient card for someone who's only going to have a 1680x1050 monitor?

Blorange
Jan 31, 2007

A wizard did it

computer parts posted:

On a related note, what would be the most cost efficient card for someone who's only going to have a 1680x1050 monitor?

I'd just use the recommendations for 1080p; it's only 15% fewer pixels.
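For anyone curious, the arithmetic behind that 15%:

```python
# Pixel count comparison: 1680x1050 vs 1920x1080 ("1080p").
smaller = 1680 * 1050   # 1,764,000 pixels
larger = 1920 * 1080    # 2,073,600 pixels
deficit = 1 - smaller / larger
print(round(deficit * 100, 1))  # 14.9, i.e. roughly 15% fewer pixels
```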

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

computer parts posted:

On a related note, what would be the most cost efficient card for someone who's only going to have a 1680x1050 monitor?

A 650ti Boost, 660ti, or 7950, whichever you can spot the best deal for. Try eBay, Amazon, and open box at Microcenter (which are marked down but still have full warranty and game codes). If you get used on eBay or Amazon make sure they mention a transferable warranty like EVGA does.

GrizzlyCow
May 30, 2011
These questions should really be directed at the PC parts thread, but for resolutions not quite 1080p I'd suggest something like a 7850, 560 Ti, 7870, or R9 270(X) if you can't afford more solid cards like the 7950 or 760/670.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Zero VGS posted:

A 650ti Boost, 660ti, or 7950, whichever you can spot the best deal for. Try eBay, Amazon, and open box at Microcenter (which are marked down but still have full warranty and game codes). If you get used on eBay or Amazon make sure they mention a transferable warranty like EVGA does.

Well, the 7950 is quite a bit faster than either of those two cards; the closest AMD card these days would be the R9 270. The 270 is exactly the same GPU as the 270X, just clocked a bit lower. You could easily overclock the 270 a bit to match it exactly.

Here's a good review with this ASUS R9 270 (£183.98) up against, say, a 660 (£189.99). Also, that 270 is very quiet.

It just depends how much money you want to spend. As proven time and time again in this thread, the sky is the limit!

HalloKitty fucked around with this message at 18:18 on Nov 21, 2013

Fallows
Jan 20, 2005

If he waits long enough he can use his accrued interest from his savings to bring his negative checking balance back into the black.

Agreed posted:

I don't want to say it's impossible to kill your card without doing BIOS modifications, but... it's really, really hard. Cards handle a too-heavy overclock gracefully under all but the most extreme temperature situations. The only way you could actually expect to damage the card is if you somehow put it in a situation where it attempts to draw more power than your PSU can provide. Apart from that, nVidia (and EVGA, for that matter) have been careful in designating the allowed deviance from nominal values in overclocking, sometimes in a rather draconian fashion; GTX 780s, for example, only allow 38mV of overvolting. If Precision X allows you to make the voltage change, it's almost certainly safe. If you push the core or the memory too far, you'll get a driver crash (or, in extreme situations, a hard lock or bluescreen, and it'll reset to factory settings on booting back up). Manufacturers are not super keen on the idea of people killing their cards and pointing angry fingers, and EVGA's FTW version should be pretty carefully binned for performance anyway. It's almost certain that you'll run into the hard TDP limit before you can do anything that would damage the card. There are five user-exposed protections plus several other safeguards to protect the hardware from damage. Overclocking a graphics card has been made much safer than overclocking a CPU, with the trade-off being that you can't go quite as crazy on a GPU, since they're already careful in the validation and binning process and don't leave much "waste" for you to take advantage of anyway.

I don't want to give you a false sense of untouchable security; it is possible to damage a card by overclocking it, but it's really, really difficult.


Thanks, I'm gonna head back over to the overclock thread and follow the real overclock part of the guide, not the quick one. I stopped my overclock where it was because I just randomly slid the sliders up and everything has worked fine.

TheRationalRedditor posted:

A 670 will run BF4 on ultra everything at 1080p extremely well. Mine definitely does. You should overclock it more.

I see drops down to 38-39, with most of the fighting around 50, where I'm at now. Are you staying at basically 60 on ultra?

thebushcommander
Apr 16, 2004
HAY
GUYS
MAKE
ME A
FUNNY,
I'M TOO
STUPID
TO DO
IT BY
MYSELF
After much thought I decided that I don't need the 290X, so if anyone is looking to buy one, let me know! Gonna toss a post in SA-Mart later. Looking to get $510 shipped (retail is $580, but I used the BF4 code).

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

PC LOAD LETTER posted:

Unless nvidia has managed some significant breakthrough(s) in transistor + uarch design... they're going to have to use lots more transistors to get close to the typical performance increases (i.e. 30%+) that people have come to expect from a new GPU. Lots more transistors on the same process with similar clocks = more heat/power usage.

Fair enough. I guess we'll see how it pans out; what's your rumor mill source, by the way? That Videocardz site guy was wrong about pretty much everything else he talked about minus the super obvious "They're going to make a better GK110 card," so I'm a little hesitant to take his word on the production schedule at TSMC. No 790, the 780Ti isn't Titan Plus and it doesn't cost $1500, etc.

It's just easy to believe that TSMC is having trouble with the shrink, or else it'd have been accomplished by now; nVidia's road map originally had Maxwell launching late in 2013 if I recall correctly, and the expectation was that 20nm would be production proven by now. Maybe that additional 8nm and the larger efficiency gap that comes with it explains AMD's architecture running so hot. If it was designed for 20nm and they more or less had to launch it at 28nm to start getting stuff out the door (and there are serious indicators in that direction), then I could see them ending up with a hotter-running part than they intended.

I genuinely doubt their word when they say they were aiming at a 95°C temperature target all along. I believe them when they say it won't kill the card, but the fact that it throttles like crazy once it hits that point, and all overclocking goes out the window faster than a 600-series Kepler hitting 82°C, seems to suggest they're spinning a consequence rather than announcing a choice they wanted to make. nVidia may be in the same position come Maxwell, but I'm reserving judgment until I see it in action.

At least I feel like I can count on nVidia to (be able to afford to?) put a quality cooler on the thing. AMD has focused heavily on software to help cooling, which I'm sure keeps production costs down, but everyone agrees they hosed up by sticking a more or less unmodified 7970-style reference blower on a much hotter chip.


Fallows posted:

Thanks, I'm gonna head back over to the overclock thread and follow the real overclock part of the guide, not the quick one. I stopped my overclock where it was because I just randomly slid the sliders up and everything has worked fine.

I actually wrote a guide for overclocking nVidia cards, and while it applies specifically to Boost 2.0 and its great temperature target adjustment, it still applies to any Kepler card that uses GPU Boost tech at all. And all you have to do with a 670 or 680, honestly, is move the power % slider as far right as it'll go (132% on that unit, I think?) and then balance core offset with memory offset.

nVidia's 770 refresh solved two problems with the GTX 680/670. First, it gave them a nice TDP boost to play with; you'll as likely as not run into power limitations around the same time, if not before, you run into limitations on how fast the core and VRAM will overclock. Second, they upped the VRAM to 1750MHz GDDR5. Before that, the 670/680 were bandwidth starved. Which means yours is bandwidth starved. You'll need to balance overclocking the memory with overclocking the core, or else you'll have a core that runs faster than the card can move stuff in and out of memory to accommodate it, or you'll sink all your available power into a too-high memory overclock that doesn't leave room to OC the core.

GK104 was a real balancing act before the 700 series and the GTX 770. But you've almost certainly got a lot more performance than you're using at the moment.
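The bandwidth bump being described is easy to put numbers on: GDDR5 transfers four bits per pin per command clock, so peak bandwidth is memory clock x 4 x bus width. The 1502MHz figure for the stock 680 is the commonly quoted reference value and should be treated as an assumption here:

```python
# Peak GDDR5 bandwidth: command clock * 4 (quad-pumped) * bus width.
# Clocks are commonly quoted reference values; board partners vary.

def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s (1 GB/s = 1000 MB/s here)."""
    effective_mts = mem_clock_mhz * 4           # GDDR5 moves 4 bits/pin/clock
    return effective_mts * bus_width_bits / 8 / 1000

print(gddr5_bandwidth_gbs(1502, 256))  # GTX 680: ~192.3 GB/s
print(gddr5_bandwidth_gbs(1750, 256))  # GTX 770: 224.0 GB/s
```

On the same 256-bit bus, the 770's faster VRAM buys roughly 16% more bandwidth, which is why the 680/670 lean so heavily on the memory offset.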

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

thebushcommander posted:

After much thought I decided that I don't need the 290X.. so if anyone is looking to buy one let me know! Gonna toss a post in SA-Mart later. Looking to get $510 shipped (retail is $580, but I used the BF4 code)

Are you perhaps interested in a 290 sans X?

LASER BEAM DREAM
Nov 3, 2005

Oh, what? So now I suppose you're just going to sit there and pout?

ClassH posted:

If you want to stick to the bottom of that range, I grabbed this one for $329:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121770

Thanks, I ended up going with the EVGA SC version of the 770 GTX. It was $350 before tax, but I did a price match with Best Buy where I had a $180 gift card.

lethial
Apr 29, 2009

PC LOAD LETTER posted:

Sure but that will be true with any high end GPU right now. They all use 200w+ of power and have dual slot coolers. That situation probably won't change with anything top end from nvidia or AMD for years.

My system has a GTX 780, and I could actually fit a GTX 780 Ti if I swap my main HDD for a green drive instead of a power-hungry Hitachi 4TB drive...

I used to be a big fan of raw performance, but having moved many times and tried to game in a poorly ventilated room in the summer, I'm now all about the overall power/noise/performance ratio. The GTX 780 is perfect for me; honestly, I'm just in love with its reference cooler. Never thought I'd see the day when a reference cooler performs so well.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

LASER BEAM DREAM posted:

Thanks, I ended up going with the EVGA SC version of the 770 GTX. It was $350 before tax, but I did a price match with Best Buy where I had a $180 gift card.

Did you make sure to ask for the vidyagame vouchers? The Best Buy site doesn't mention Assassin's Creed or Splinter Cell, but it does say all the 770s get Batman. If you forgot, I'd go back with the receipt and try to scoop them up if they have 'em.

LASER BEAM DREAM
Nov 3, 2005

Oh, what? So now I suppose you're just going to sit there and pout?

Zero VGS posted:

Did you make sure to ask for the vidyagame vouchers? The Bestbuy site doesn't mention the Assassin's Creed or Splinter Cell but it does say all the 770's get Batman. If you forgot I'd go back with the receipt and try to scoop them up if they have em.

I actually bought AC4 already. That and Watch Dogs are the main reasons I'm upgrading. Gotta have those max settings.

woppy71
Sep 10, 2013

by Ralp

HalloKitty posted:

You can get Geforce 640 and Radeon 7750 at 65 quid, but the 640 is pretty bollocks, beaten easily by the 7750.

Spend just under 75 you can get a GDDR5 7750 with a passive cooler!

At 75 as I suggested above, you could also go for a Geforce 650, which is basically exactly the same as the 7750. (Although given the choice, I'd personally take the fanless 7750 for the silence).

Thanks for the info and links :) That AnandTech site looks pretty interesting, I'll take a closer look at it.

Luckily, noise isn't an issue for me, so a passively cooled option isn't high on my wish list; an actively cooled card would be fine. I was looking at this http://www.amazon.co.uk/gp/product/B009X4J8Y8/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A3P5ROKL5A1OLE as a possible purchase. Is XFX a reputable brand?

Fallows
Jan 20, 2005

If he waits long enough he can use his accrued interest from his savings to bring his negative checking balance back into the black.

Agreed posted:


And all you have to do with a 670 or 680, honestly, is move the power % slider as far right as it'll go (132% on that unit, I think?) and then balance core offset with memory offset...

In EVGA Precision X the power slider goes up to 145%; should I still crank it all the way up?

I went to 120% and did 35/57 GPU/mem offsets; so far in BF4 I haven't gone under 50, it's much better. Later tonight I'm gonna try real stress tests (that Unreal one), but if it's stable for 4-hour Battlefield sessions that's basically good enough, right?

Athropos
May 4, 2004

"Skeletons are Number One! Flesh just slows you down."
The only bad thing about the free rear end Creed 4 and Batman and whatnot is that they shipped the small cards with the serial # on them separately from the video card through Purolator, in padded envelopes. Got my 780 Ti, still waiting on some rear end Creed.

craig588
Nov 19, 2005

by Nyc_Tattoo

Fallows posted:

In EVGA Precision X the power slider goes up to 145%, should I still crank it all the way up?

Yes. The power target maximum is set by the manufacturer, so you're not really overclocking anything by maxing it out. All you need to do is make sure the temperatures aren't getting too crazy, because the much higher average loaded clock speeds do result in a lot more heat. Also, the percentage value is arbitrary: one manufacturer's 110% could be higher than another's 200%.
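(To make the "arbitrary percentage" point concrete: the slider scales each card's own board-power limit, so the absolute wattage behind the same percentage differs between vendors. The wattages below are made-up illustrative numbers, not any particular card's spec.)

```python
def power_limit_watts(board_power_w, target_pct):
    # The power-target slider is a percentage of THIS card's
    # baseline board power, not an absolute figure.
    return board_power_w * target_pct / 100

card_a = power_limit_watts(250, 110)  # "110%" on a 250 W card -> 275.0 W
card_b = power_limit_watts(130, 200)  # "200%" on a 130 W card -> 260.0 W

print(card_a > card_b)  # True: one vendor's 110% exceeds another's 200%
```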

thebushcommander
Apr 16, 2004
HAY
GUYS
MAKE
ME A
FUNNY,
I'M TOO
STUPID
TO DO
IT BY
MYSELF

deimos posted:

Are you perhaps interested in a 290 sans X?

Probably not, even a 7970 would be overkill for the amount of gaming I do on this thing.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Fallows posted:

In EVGA Precision X the power slider goes up to 145%, should I still crank it all the way up?

I went to 120% and did +35/+57 GPU/mem offsets; so far in BF4 I haven't gone under 50 FPS, it's much better. Later tonight I'm gonna try real stress tests (that Unreal one), but if it's stable for 4-hour Battlefield sessions that's basically good enough, right?

Hot drat, 145%? That's fantastic; my 680 only went to 132% or 135% (FactoryFactory could probably tell you more). You should have plenty of power. Slide it all the way to the right and proceed to overclock your heart out. Make a custom fan profile that ramps the fan up to full before it hits 75°C, since it begins throttling at 82°C. Balance VRAM with core OC, and shoot for around 1200MHz on the core if you want to go for broke.




Byyyyyyy the way, y'all, the 780Ti is INCREDIBLE :pcgaming:

I'm in like the top 4% of all scores worldwide on epeen measuring software. And videogames, holy crap is it good at them. I'm using a fairly modest overclock, too, this is just a golden chip. I reckon most 780Ti GK110s will be great overclockers, just variations on "great." They were the chips closest to becoming the highest end Quadro card this generation, after all.

I am personally sparing as much available TDP (all 106% of it... come onnnn nVidia) for the core, so just running the VRAM at 1850MHz but it's rock solid stable. Base clock at 1136MHz, boost clock at 1202MHz, and it'll boost as high as 1260MHz in games. 7400MHz effective VRAM clock. I don't think it needs more, but it will actually run at the 106% power target I set for it, which is nice. I think it's probably got more headroom than this, I'll play around with it, but at this clock I'm already quite satisfied with the level of performance I'm seeing.

54.5 GPixels/sec, 272.6 GTexels/sec, 355.2 GB/sec memory bandwidth.

God drat. And cards are just going to get more powerful than this over time, too. :holy:

Agreed fucked around with this message at 01:25 on Nov 22, 2013
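(Those theoretical figures check out against the GK110 reference specs, for the curious. The ROP/TMU counts and bus width below are the GTX 780 Ti reference numbers as I recall them — 48 ROPs, 240 TMUs, 384-bit — so this is a sanity-check sketch, not an official formula.)

```python
rops, tmus, bus_bits = 48, 240, 384  # assumed 780 Ti reference specs
core_ghz = 1.136                     # the quoted 1136 MHz base clock
mem_transfer_ghz = 7.4               # 1850 MHz GDDR5 x 4 (quad data rate)

pixel_rate = rops * core_ghz                   # GPixels/s
texel_rate = tmus * core_ghz                   # GTexels/s
bandwidth  = mem_transfer_ghz * bus_bits / 8   # GB/s

print(round(pixel_rate, 1), round(texel_rate, 1), round(bandwidth, 1))
# -> 54.5 272.6 355.2, matching the figures quoted above
```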

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
But...how are you going to handle all the brownouts in your neighborhood? :ohdear:

Glad that it came in, though! Heck, I think we all have our cards now except for my buddy. I suppose that's USPS for ya, though. :shrug:

I'm kinda envious of being able to just stick the card in and have it work. I miss that. The only remaining issue with mine is the hardlocked black screen I woke up to this morning. The issues people have been having with those seem to be tied to the Elpida memory vendor (cards with Hynix memory have been free of these issues). Word from an AMD rep is that drivers are coming to fix that, but I'm dubious that it's something a new driver can fix. We'll see in the coming days.

The highest temp I've hit on my GPU after some intense gaming was 65°C at 1 GHz (stock is 947 MHz). So I may try to see how far I can push things, although I've always been more reluctant to test a GPU's limits.

Oh by the way:

60.6 GPixels/sec, 151.5 GTexels/sec, 320.0 GB/sec memory bandwidth at 1 GHz.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Ghostpilot posted:

Glad that it came in, though! Heck, I think we all have our cards now except for my buddy. I suppose that's USPS for ya, though. :shrug:

I don't have Agreed's 780 yet. Was there supposed to be a tracking number I was supposed to get? It's okay, I believe in him. I believe in the card coming tomorrow :pray:

As for Ghostpilot's acquaintance, I've been staring at the tracking info telling me almost nothing about where it is right now :negative:

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Sidesaddle Cavalry posted:

I don't have Agreed's 780 yet. Was there supposed to be a tracking number I was supposed to get? It's okay, I believe in him. I believe in the card coming tomorrow :pray:

As for Ghostpilot's acquaintance, I've been staring at the tracking info telling me almost nothing about where it is right now :negative:

Yeah, that's been my luck with USPS as well. Actually, your luck has been better than what I've ever had with USPS tracking: I've never gotten one of their tracking numbers to work. Just funny of them to schedule a day, have the package tracked, and still be off by (so far) 2 days. Nothing against you, though, that's just how it goes with USPS sometimes. Tomorrow will probably be the day, though!

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Today, I learned that Gamestop sells GPUs :psyduck:

JerikTelorian
Jan 19, 2007



I'm currently running SLI'd GTX 460s (768MB each). Performance is mediocre in AC4 (averaging 30 FPS), and ARMA 3 tonight was chugging hard (16 FPS, though that might have been network stuff).

Would an upgrade to a 660 benefit me as much as I'm imagining? EVGA has two 660 models, one with 2GB and one with 3GB. Are the differences between 2GB and 3GB substantial, or should I save a few bucks and go with the cheaper one?

Other stats: i7-860 CPU; 16GB RAM.

JerikTelorian fucked around with this message at 04:14 on Nov 22, 2013

PC LOAD LETTER
May 23, 2005
WTF?!

Agreed posted:

what's your rumor mill source, by the way?...I genuinely doubt their word when they say that they were aiming at a 95°C temperature target all along
I've watched a number of sites for rumors/info over the years; some of the best ones are gone now (ie. Aces *sniff*), unfortunately. RWT and B3D are my main go-tos for GPU/CPU info/leaks. Interesting stuff still occasionally pops up on some investor boards I look at, usenet (still!), TechReport, Anandtech, WCCFtech, vrzone, and [H]. It's nothing like the "old" days (ie. 2003 and prior); people are fairly tight-lipped now that NDAs are being enforced seriously. Hawaii originally being designed for 20nm makes a lot of sense to me too. AMD must've had a 28nm version in the works the whole time as a "Plan B" in case it looked like TSMC couldn't deliver on time. A 28nm "Plan B" Hawaii designed to run hot is believable, and the HSF situation would make sense if they hadn't seriously planned to ship a 28nm Hawaii until very recently. Perhaps they wanted to get something out for the holidays and weren't sure a redesigned HSF would make it in time, too?

That is all guess work though.

While their reference cooler could've been a lot better, I think the throttling issue is overblown too. There are lots of people who actually own the card, and some reviewers, who've reported little or no throttling with the default fan cap on the 290 or uber mode on the 290X. Most of the complaints about the heat/power/throttling issues seem to be coming from people who don't even own the card.


lethial posted:

My system has a GTX780, and I could actually fit a GTX 780 ti if I change my main HDD into a green drive
If you could run a 780Ti, you could run a 290 or 290X in that case, at least as far as dealing with the heat/power usage goes. If you hate the noise the AMD reference HSF makes and either can't fit an aftermarket HSF in your case or simply don't want to deal with mounting one, then fair enough. They can't please everyone.

PC LOAD LETTER fucked around with this message at 04:27 on Nov 22, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Sidesaddle Cavalry posted:

I don't have Agreed's 780 yet. Was there supposed to be a tracking number I was supposed to get? It's okay, I believe in him. I believe in the card coming tomorrow :pray:

As for Ghostpilot's acquaintance, I've been staring at the tracking info telling me almost nothing about where it is right now :negative:

As embarrassing as it is, my wife just hasn't had time to get the receipt from the post office to me. But it shipped out yesterday, insured, and ought to be showing up if not tomorrow then I'd certainly think no later than Saturday - regardless, I'll politely insist that she dig around and find the receipt so I can get you tracking, that's my bad.

I... I gave you my Game of Pwns poster, though. Cherish that goofy loving thing. Cherish it.


Bloody Hedgehog
Dec 12, 2003

💥💥🤯💥💥
Gotta nuke something

JerikTelorian posted:

I'm currently running SLI'd GTX 460s (768MB each). Performance is mediocre in AC4 (averaging 30 FPS), and ARMA 3 tonight was chugging hard (16 FPS, though that might have been network stuff).

Would an upgrade to a 660 benefit me as much as I'm imagining? EVGA has two 660 models, one with 2GB and one with 3GB. Are the differences between 2GB and 3GB substantial, or should I save a few bucks and go with the cheaper one?

Other stats: i7-860 CPU; 16GB RAM.

If you're upgrading for AC4, then no, there wouldn't be much benefit. AC4 is horribly optimized, and even top-end video cards and dual-card setups aren't able to really brute-force through the slowdown in certain areas of the game. The foliage in that game is a killer. And ARMA... well, ARMA is its own weird beast. More powerful cards can help, but then you just get weird matches where performance goes in the toilet for no apparent reason.
