InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
It'd be cool if you could get an external GPU for an ultrabook or something so you could game at home. Well worth the premium, or at least I'd pay it :shobon:

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
The 660Ti is looking really, really nice, but TweakTown is obnoxiously blunt about their dislike for Nvidia, and they bring it up in every single review, multiple times, over and over. Apparently they're upset because they aren't being sent information about products or review units, and they have the gall to call Nvidia unprofessional while they sit and whine like children. Ugh. I wish it were Anandtech; their reviews are so much better :(

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
I hadn't even looked at that. The only intensive games they have are Metro 2033, Just Cause 2, and DiRT 3. I mean, jeez, gaming benchmarks aren't any better than synthetic benchmarks when 2/3 of your lineup came out two generations ago.

Furthermore, it seems this spat with Nvidia has been going on for years, and they've kept making snide comments and working multiple jabs into every page the entire time. I'm a little skeptical of their review now; it seems almost as many words are spent badmouthing the manufacturer as reviewing the actual product.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Even so, to quote the article, it's "88% of the performance [of the 670] for 75% of the price". It's *still* a drat good card.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

MixMasterMalaria posted:

I'm looking at that one too, I don't like the blower cooler but the warranty is unparalleled and transferable. I'm still rocking a 4770 card with a sandy bridge cpu so this would be one hell of an upgrade. But.. My... budget... :ughh:

Get a 560Ti, then. It still rocks the socks off any 1080p display, sits right around the $200 mark right now, and often hits $180-$190 on sale.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Try underclocking the card using MSI Afterburner or a similar utility. If that solves the issues, then the problem is the card. If it doesn't, then the problem is the drivers, which is a much more difficult problem to fix.

My money, however, is on the card.
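
If you want to sanity-check that the underclock actually took effect while you test, here's a rough sketch that polls nvidia-smi (assuming an NVIDIA card with the nvidia-smi utility installed; the interval and sample count are arbitrary):

```python
# Rough sketch: poll "nvidia-smi -q -d CLOCK" and print the clock lines,
# so you can confirm the underclock held during a test run. Assumes an
# NVIDIA card with nvidia-smi on the PATH.
import subprocess
import time

def log_clocks(interval_s=5, samples=12):
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "-q", "-d", "CLOCK"],
            capture_output=True, text=True,
        ).stdout
        for line in out.splitlines():
            if "Graphics" in line or "Memory" in line:
                print(line.strip())
        print("---")
        time.sleep(interval_s)

if __name__ == "__main__":
    log_clocks()
```

If the clocks hold at the underclocked values and the crashes stop, that points at the card; if it crashes anyway, look at the drivers.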

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
The only other things I can think of would be to clean up the drivers for your old card. Other than that, try those beta drivers.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
A cursory Google search suggests that you're still hosed, since a 690 is just SLI in a single PCIe slot.

Anyway, a 680 beats a 5970 by 15% at the thinnest margin, and by up to 100% in several AAA titles released this year. You read lovely reviews. Seriously, it gets 60+ FPS at 2560x1600 on every modern game. You will see literally zero perceptible benefit from spending an extra $500.

edit: hold on, if you have an AMD motherboard then your problems are much, much deeper than a graphics card. You either have a now-dated Phenom or you're running Bulldozer, which is lovely and will probably bottleneck you long before the graphics card does.

InstantInfidel fucked around with this message at 03:34 on Aug 27, 2012

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
I dunno, I could only find one result, a forum post on Tom's Hardware saying that the card would need SLI support.

However, the SLI Wikipedia entry seems to disagree and claims that the motherboard doesn't need to support SLI because everything is on the chip, which supports what you said and is definitely counter-intuitive :colbert:

The point remains that the poster is trying to light money on fire and would be better served by buying a new motherboard and processor to go with his 680, or by giving it to charity.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Wait, what? 30 FPS? Your 5970 should be able to do that, according to the benchmarks I linked. Seriously. You *might* have to turn the settings down a single notch, from Extreme to Ultra/Very High, on a select few games that have lovely optimization (I'm looking at you, every Batman game ever made). The 680 will handle that with no issue, and the 670 would be able to as well, on most games.

Also, again using the same benchmark suite, Anandtech would suggest that in a CPU-limited scenario (one where the GPU is irrelevant), your current processor is somewhere between 25% and 50% behind an i5-2500k, which is itself about 10% behind a 3570k, the current SH/SC recommendation.

Get a 680, a nice Z77 board, and a 3570k. I really doubt anyone here will recommend otherwise, and if they do, it'd probably be to suggest a 670.

Glen Goobersmooches posted:

Yes, of course a single 680 can do that. They are a lot more powerful than the 5970. A 690 is overkill for basically anything unless you're building one of those hilariously dumb 3D gaming XXXperience cubes with like 4 monitors surrounding a replica Jean Luc Picard's captain chair

See, I thought that too, but according to those benchmarks (I just love benchmarks and would probably do horrible things to get a job at Anandtech), it actually beats a 6970 and is surprisingly close to a 680, especially given that it's two generations older.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

The Lord Bude posted:

I'd be interested to see any other data from other tests beyond that report. They only tested 3 games, all at 1920x1080 rather than 2560x1440. I suspect the difference would be slimmer at the higher resolution. BF3 is easily the most demanding graphically of the 3 games they tested and it in particular showed virtually no difference between CPUs. Skyrim, on the other hand is well known as pretty much the most CPU limited game in recent memory, and should never have been used in that test simply because it is such an anomaly.

I'd like to see more data, and particularly more data from a much wider array of games, before I put too much stock in what they are saying.

If I was going to replace my cpu/mobo/ram - do you think a reinstall of windows would be called for? I replaced my hard drive only a few months ago, so I certainly wouldn't be replacing that.

Battlefield on Ultra is less demanding than Skyrim, and both run at better than 60 FPS at 2560x1600 with the highest preset. Also, Skyrim isn't even especially rigorous on a CPU. In the report Agreed linked, they even test it: using four-year-old dual-core CPUs, it still manages to break 50 FPS in a GPU-unlimited situation. Performance scales predictably with resolution: a setup won't show a 25% margin between two CPUs and then magically reduce that margin by 10% when you double the resolution. It'll still be around 25%, assuming you haven't hit some insurmountable wall (like using HD4000 on a 1440p monitor, which would produce outliers).
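
To put numbers on that scaling argument (the FPS figures here are made up for illustration, and the "both setups drop by the same GPU factor" assumption is the post's argument, not a measurement):

```python
# Hypothetical illustration: if both setups lose the same GPU-bound
# factor at the higher resolution, the relative CPU margin is preserved.
fast_1080p = 100.0
slow_1080p = 75.0                      # 25% behind at 1920x1080
margin_1080p = 1 - slow_1080p / fast_1080p

gpu_factor = 0.6                       # both drop ~40% at 2560x1600
fast_1600p = fast_1080p * gpu_factor   # 60 FPS
slow_1600p = slow_1080p * gpu_factor   # 45 FPS
margin_1600p = 1 - slow_1600p / fast_1600p

print(f"{margin_1080p:.0%}, {margin_1600p:.0%}")  # 25%, 25%
```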

Seriously. Spend that same $1000 and get a 2500k and a Z77 mobo. If you buy a 690 as-is, you're going to be lighting all of your money on fire, since your CPU is kicking you in the balls while your monitor laughs.

edit: Agreed always makes better posts than I can :(

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

The Lord Bude posted:

I'm very happy to accept that there is a significant performance difference between CPUs, I'm just not convinced that that will translate into a significant performance difference in most games, skyrim notwithstanding.

Jesus, you've got at least 3 people telling you that it will and you still don't believe us?

Look, let's be blunt: your processor is a relic piece of poo poo at this point. It's your bottleneck, not the GPU. If you want to spend $1000 and not see any improvement at all, then go do it and stop trolling the thread.

edit: and don't go buy a goddamn 680 until you upgrade your CPU and see the difference. I suggested that before I realized you were using an AMD processor. Upgrade your CPU and motherboard, and only then should you consider buying a 680.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

The Lord Bude posted:

I was never going to spend $1549. I was waiting for a 690 to hit $1k. I'd need about $1600 to replace cpu-mobo-cooler-ram-gpu all at once. Either way though, I'm still going to wait till next year.

What? The i5 is $220, a good Z77 is $175, RAM is $30 (and you probably wouldn't need to replace it), a good PSU is $80, and a nice, new case is $100, tops. That comes out to right around $605; you might not need to replace your case if you're happy with what you have (but you *need* to replace your PSU if it's 4+ years old). Your current GPU might meet your expectations, but even if it doesn't, the 670 is $400, and that still puts you at just over $1000, and it'll handle 1600p just fine.
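
The sum, using the post's own prices (the $400 670 is the optional line item):

```python
# Prices straight from the post above.
parts = {"i5": 220, "Z77 board": 175, "RAM": 30, "PSU": 80, "case": 100}
base = sum(parts.values())
print(base)        # 605
print(base + 400)  # 1005, if you add a $400 670
```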

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Which slot did you put it in on your motherboard? If I'm remembering right, you always want to start with the topmost slot (the one closest to the CPU on most boards), since it's the one that can actually provide the full x16 bandwidth.

However, it appears that on your mobo, either of the two PCIe 3.0 slots should give x16 bandwidth, so that might just be GPU-Z being lovely.
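
If anyone wants to double-check the negotiated link width without trusting GPU-Z, here's a sketch for Linux boxes (it just reads the standard PCIe sysfs attributes; Windows users are stuck with GPU-Z or similar):

```python
# Linux-only sketch: read the negotiated vs. maximum PCIe link width
# for every PCI device from sysfs. A GPU showing x8 of x16 here would
# confirm it's actually running at reduced bandwidth.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    cur = dev / "current_link_width"
    mx = dev / "max_link_width"
    if cur.exists() and mx.exists():
        print(f"{dev.name}: x{cur.read_text().strip()} "
              f"of x{mx.read_text().strip()}")
```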

edit: did you really spend $240 on a motherboard?

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Intel has had demo Haswell chips out for some time now; they've been shown at a couple of big shows. They're not anywhere near consumer-ready yet, but the chip exists and works as well as Intel says it does, so the comparison is quite a bit more valid.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Anandtech just put up their Alienware M18X R2 review, and if you're the kind of person who really, really wants SLI'd 680Ms, then it's the machine for you. I didn't realize just how beefy those things are: they're essentially 670s with a lower clock, and there are two of them. In a laptop. :drat:

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
EVGA, ASUS, and Sapphire usually make top-notch products.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
I had that same problem. I had to do a System Restore to a point a few weeks back, and that solved my issues; they haven't come back since.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

Klyith posted:

There's nothing wrong with OCZ power supplies, their high-end stuff have the same guts as PC Power & Cooling units (because OCZ bought them). The cheap ones are not as good, but cheapo PSUs are crappy no matter what badge is on the case. I'm not saying OCZ is a company that people should buy any product from, but if one were doing so, an OCZ PSU isn't bad.
A psu would also have to be impressively hosed up, not just cheap trash, to permanently damage components.

It's hard to say why Stumpus's computer is running badly because "slow as poo poo" isn't very descriptive, but on a box you haven't used for a few years the BIOS battery might have run down and reset to safe defaults. Aside from that, it's probably just the OS being full of old crap and a reinstall would be snappy again.

Not only are OCZ's PSUs poorly built (like everything else OCZ produces), but a bad PSU is pretty much the only component of your system that can ruin other parts when it fails. PC P&C was a good company, until OCZ bought them.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
A relatively common failure in a PSU (I don't know the statistics on this, but I've seen it often) is a capacitor blowing. During regular operation, the resistance of a capacitor is pretty low. When it blows (not literally exploding, although it can cause a big discharge that looks like one), it instantaneously becomes a gap in the circuit. Immediately after that, and for all intents and purposes at the same instant, current is still flowing. In the process, the resistance of the circuit has increased immensely and is approaching infinity, since there's not actually a circuit any more. When the current realizes that "Oops, something's hosed," a massive voltage (called a back EMF) is created and sent through the circuit instantaneously. Keep in mind this all happens in a matter of a few thousandths of a second, and most of it happens at virtually the same instant. The end result is the same: anything currently drawing power gets a huge voltage spike, and nothing drawing power in a PC system is rated for the kind of spike it potentially just experienced.
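
For the physics-minded: the spike described above is the standard inductive kick (this is textbook EM applied to the scenario, not something from the thread). Interrupt a current abruptly and the voltage across the circuit's inductance is

$V = -L\,\dfrac{dI}{dt}$

so the faster the current is cut off (large $dI/dt$), the bigger the transient; that's the "resistance approaching infinity" hand-wave made precise.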

So yeah, it might survive and not blow up your entire system. But are your $300 video card, $130 motherboard, and $220 CPU worth putting at any perceptible risk in order to save $20 on your PSU? Probably not.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
That's already a sub-par piece of tech. The longer it's in use, the greater the chance of failure. It might never be a problem, or it could blow up tomorrow.

Anyway, let's end this derail here. This is the GPU thread, not the PSU thread.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

Magic Underwear posted:

They are still running at 1080p, after all.

Uh, no. Games that are developed to run at 1080p will do so, but the new consoles will support 4K. So yeah, you'd be an idiot to buy a 2GB high-end card today.
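
For scale, the raw pixel math (the 4 bytes/pixel below is my assumption for a 32-bit framebuffer, not a figure from the thread):

```python
# 4K is exactly four 1080p screens' worth of pixels.
px_1080p = 1920 * 1080   # 2,073,600
px_4k    = 3840 * 2160   # 8,294,400
print(px_4k // px_1080p)  # 4

# A single 32-bit framebuffer at 4K is still only ~32 MB; the real VRAM
# pressure at 4K comes from higher-resolution assets, hence >2GB cards.
print(px_4k * 4 / 2**20)  # ~31.6 MB
```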

edit: Nevermind on the PS4, only the Xbox One will support 4K games.

InstantInfidel fucked around with this message at 21:52 on Jun 9, 2013

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

zylche posted:

That's because they're using HDMI 1.4a which supports 4K at 30hz, not because the system is powerful enough to handle a game rendered at that size.


Disgustipated posted:

The PS4 has a GPU that's 50% faster than the Xbone's. The PS4 supports 4K output, they're just not pretending it's powerful enough to actually play games at that res. No way we see 4K games this generation, it'll be a loving miracle if all games run at 1080p.


HalloKitty posted:

No games will run in 4K unless they are extremely simplistic ones. The PS4 has a far faster GPU than the Xbox One, for a start.

But we're still talking about the PS4 being somewhere between a 7850 and a 7870, and the Xbox One being something more like a 7790++

The Xbox 360, technology from 2005, manages to run Skyrim at 1080p at 30 FPS. It's also silly and incorrect to expect hardware to perform identically on two things as different as a PC and a game console. A 7790 that doesn't have to worry about the background bullshit from Windows or hacked-together drivers can put a lot more of its power towards being, you know, a GPU.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

Jan posted:

As mentioned, Skyrim (like most games on 360) renders at 720p and the console upsamples it to 1080p. Not to mention it's not a GPU intensive game. On a mid to high range PC on vanilla Skyrim, you're far more likely to be bottlenecked by the CPU when raising shadow settings to Ultra. With lower shadow settings, you're more likely to be GPU bound but will still have high frame rates even with everything else dialed up.

What I'm trying to say is that the hardware hasn't even been released yet, and people are already trying to decide what it can or cannot do. Let's wait for it to be released before we all jump to conclusions.

edit: Also, my point about the 360 and Skyrim is less about its resolution and more that it'd be really loving hard to run that game on a PC from 2005 using an equivalent graphics chip.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
brb downloading those drivers and overclocking my 680 because i want a 780

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

Magic Underwear posted:

Actually no, about this we know for sure. You were claiming PS4 games could run at 4k. Sony has said that while the PS4 can output 4k video, games can't. Neither xbone nor PS4 will ever run a game at 4k.

That's not what I said, try again:

InstantInfidel posted:

Uh, no. Games that are developed to run at 1080p will do so, but the new consoles will support 4K. So yeah, you'd be an idiot to buy a 2GB high-end card today.

edit: Nevermind on the PS4, only the Xbox One will support 4K games.

The Xbox One will support 4K games, but I wouldn't expect them to appear in quantity for a couple of years.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
I'm having more and more trouble finding reasons not to upgrade to a 780, for no reason other than that I want one. I could try to justify it, but I have a 680, and it's not giving me any problems (and overclocks like a champ). I feel like my ego is going to win and my wallet is going to lose. :getin:

e: should've read the new posts, ^^^ So I hear you want to buy a 680...

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

Factory Factory posted:

With all these cards being sold, I could've gone from 6850 CF to 680 SLI for the same cost as swapping to a Prodigy and a single 680.

Talk about overkill for 1920x1200 :getin:

poo poo, I'm only on 1080p on a single monitor. Overkill is like :catdrugs: for me.


Agreed posted:

Be sensible! or not these things are loving awesome

See, I just know that if I do this, it's a slippery slope to a custom waterblock setup for my CPU and GPU so I can get a no-noise and barely-stable overclock rig.

e: I just checked and I'm a week past EVGA's step-up program. poo poo.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Try going to the BIOS and changing the audio output settings. If your card has an HDMI out, I'd bet the problem is that it's trying to send sound through that instead of through the 3.5mm jacks.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

TheRationalRedditor posted:

Not sure why you posted this here but I've had the 1500s for like 18 months and they're one of the best things I've ever put on my head. If they ever break, I'll just get another.

The guy above him asked for a recommendation on good cans too.
