 
Aws
Dec 5, 2005


It's replacing my 5870. This is totally irresponsible of me, but I waited about a week to let the impulse purchase desire pass, so it feels kind of responsible. Also, I need it for loading high res texture packs and running at 60 FPS, so it's not really optional :pcgaming:

Saving for a rainy day? What?

I'm thinking scented candles and some romantic music while I unbox and install. Any suggestions for music? What do you guys usually play when a sexy new graphics card arrives?


LRADIKAL
Jun 10, 2001

Fun Shoe
Just a tip for those who have upgraded recently and maybe aren't impressed: try turning on AA in your favorite games if you haven't already. FXAA seems to be free in World of Tanks with my config. Almost looks dreamy!

Arzachel
May 12, 2012
It sucks that there are only two GPUs this gen that are worth a drat, HD7850 and GTX670 :( While both of them are pretty great, I expected such deals in every price range. I guess Canary Islands and GK110 aren't that far off now.

spasticColon posted:

I overclocked my HD7850 to 1GHz and ran the Heaven Benchmark 3.0 on it for about an hour without issues. The GPU only got to 61C under load at 1GHz too. :stare::fh:

I've seen some people strapping huge coolers to 7850s and running them at 1300 MHz core. It seems like the only thing limiting even higher clocks is voltage throttling at around 1.3v or so.

Arzachel fucked around with this message at 18:16 on May 26, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Arzachel posted:

It sucks that there are only two GPUs this gen that are worth a drat, HD7850 and GTX670 :( While both of them are pretty great, I expected such deals in every price range. I guess Canary Islands and GK110 aren't that far off now.

Do you mean the HD 7950? The 7850 is a strong performer if you want "last gen performance++," but it's not the top SKU; I wouldn't call it the great card from ATI this gen...

It's pretty normal for there to be the "top ender" and then the "closer to the price:performance curve" card; we just haven't had a real update on nVidia's side for years. They Fermi'd the poo poo out of the graphics card market. And ATI did basically the same thing, process tweaks rather than huge architectural shifts.

This is a new generation of cards. Look, averaging over a 30% performance increase is impressive as hell; these are great cards. The value units from ATI are coming in strong too - they'll end up filling the same role that, say, the 6850s did last generation.

I don't know WHAT to expect from GK110. My crystal ball is currently out for repair, but nVidia is strong on branding and if the performance improvement is commensurate with transistor count then that will be, like, the 700-generation card... Or we'll have another "GTX 285 and GTX 260 Core 216" situation where they bring more powerful hardware to market under the same generational name in order to compete with (what will hopefully be) an even stronger contender performance-wise from ATI.

What happened here that usually doesn't is that nVidia looked at pricing and said gently caress IT, WE WIN. ATI had a few months (unfortunately plagued by supply issues) of total market dominance, did a pretty good job in my opinion of managing to balance price of supply to maximize the revenue from what they could put out since demand was rabid... But once the 680 dropped, so did the sky, and nobody really talks about the 7970 in reverent tones anymore even though it overclocks like a motherfucker too and it's got some really fancy stuff of its own going on. The 680 just kinda marginalizes it, clock-for-clock discrepancy in most games be damned.

Plus, if I recall correctly, the 7970 still works better in Metro 2033! ... :smith:

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
No, he's right; the 7850 and 670 are the cards of the moment, the best value by far at their respective price points. You get a 670 if you have the money, the 7850 if you want to spend a reasonable amount on a card.

Arzachel
May 12, 2012

Agreed posted:

Do you mean HD 7950? The 7850 is a strong performer if you want "last gen performance++" but it's not in the top SKU, I wouldn't call it the great card from ATI this gen...

The 7850 is this generation's GTX 460, except everybody ignored it at first due to overclocking being locked at a 1050 MHz core maximum for a while. Up to 50% overclocks limited only by voltage throttling isn't something I've seen before. It trades blows with the GTX 580 just by bumping the core slider as far as it goes.


quote:

I don't know WHAT to expect from GK110. My crystal ball is currently out for repair, but nVidia is strong on branding and if the performance improvement is commensurate with transistor count then that will be, like, the 700-generation card... Or we'll have another "GTX 285 and GTX 260 Core 216" situation where they bring more powerful hardware to market under the same generational name in order to compete with (what will hopefully be) an even stronger contender performance-wise from ATI.

The earlier launch ended up hurting AMD as much as helping them. They traded clockspeeds for yields, which shows in the huge headroom most cards have. Nvidia stripping GK104 of compute grunt and focusing on gaming instead was also quite a curveball. I'm guessing Nvidia is going to do the same with the 700 series, with GK114 taking up most of the high end and GK110 mostly relegated to compute.
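For reference, the rough arithmetic behind the overclock percentages being thrown around in this thread. The 860 MHz stock core clock is an assumption (the reference HD 7850 spec), not something anyone stated:

```python
# Overclock headroom expressed as a percentage over stock, assuming
# the reference HD 7850 core clock of 860 MHz.
STOCK_MHZ = 860

def oc_percent(target_mhz, stock_mhz=STOCK_MHZ):
    """Overclock as a percentage above the stock clock."""
    return (target_mhz / stock_mhz - 1) * 100

print(round(oc_percent(1050)))  # the old software cap: ~22% over stock
print(round(oc_percent(1300)))  # the big-cooler cards: ~51% over stock
```

So "up to 50%" checks out for the 1300 MHz cards, and the old 1050 MHz cap was only about a 22% ceiling.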

Agreed
Dec 30, 2003


In terms of price:performance, I guess that's true. But the 7850 really is "last gen++," with the exception of superior DX11 performance and a more reliable Civ V bench :v:.

Bonus points to the 7850 for overclocking headroom: you could usually get a 6950 to at least 900MHz, and reasonably lucky people into the 960MHz region, while the 7850 seems like it can push to 1000MHz pretty reliably and usually go higher with good cooling, if you're willing to risk the voltage :)

But I don't see it as a performance contender - the 670 might as well friggin' be a 680, the performance discrepancy is so small.

Edit: Oh, hold on, the 7850 was artificially choked for overclocking and it turns out you can get more than that out of it? Reliably, or "the good cards?"

And I do think it's possible nVidia took a lesson from Fermi not to cross the streams and may indeed relegate big Kepler to compute. It was a strange choice to build a card that has far more compute than a gamer will ever require - and potentially never even use, unless they're running folding@home or something - and just accept the problems of making that high-end compute chip also work for high-end graphics. There's room for a separation there, and with the way big Kepler is designed, they could probably run it for some time as Quadro/Tesla workstation units, ignoring the gaming consumer market entirely.

Agreed fucked around with this message at 19:03 on May 26, 2012

Arzachel
May 12, 2012

Agreed posted:

Edit: Oh, hold on, the 7850 was artificially choked for overclocking and it turns out you can get more than that out of it? Reliably, or "the good cards?"

And I do think that it's possible nVidia took a lesson from Fermi not to cross the streams and may indeed relegate big Kepler to compute, it was a strange choice to build a card that has far more compute than a gamer will ever require - potentially never even use unless they're folding@home or something - and just accept the problems of making that high end compute chip also work for high end graphics. There's room for a separation there, and with the way big Kepler is designed, they could probably run it for some time as Quadro/Tesla workstation units, ignoring the gaming consumer market entirely.

I've yet to see a 7850 not hit 1100 MHz on the core. Some might need a bit more voltage, but the average is around 1150-1200 MHz for most, with outliers hitting ~1300 MHz on custom cooling. At first you could only strip the limit through a setting in ASUS's overclocking tools, but it seems to have gotten easier lately.

The issue with relegating GK110 to compute only is that they can't offload R&D costs onto consumers. Focusing on GK114 would probably make them more competitive at the high end for less, but it seems they're willing to make up for it with a huge chip.

Arzachel fucked around with this message at 19:37 on May 26, 2012

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
The 1GHz GPU overclock I got on my HD7850 is with stock voltage, so did I just get a good one?

Edit: It's the stock clocked Sapphire card BTW.

spasticColon fucked around with this message at 19:59 on May 26, 2012

Arzachel
May 12, 2012
Hard to tell; voltages on the 7850 range from ~1.07v up to ~1.21v. Try pushing it closer to 1100 MHz if you have the time to mess around with it. The temperature increase from overclocking without overvolting is pretty negligible.

Agreed
Dec 30, 2003


Arzachel posted:

Up to 50% overclocks limited by voltage throttling isn't anything I've seen before.

Could you point out some of these? 'Cause I'm looking but not having an easy time finding them. It seems more like "yeah, they overclock really well, you can pretty safely expect a good OC out of them and performance that deprecates the GTX 570 completely and pushes up against nVidia's top end last gen card for a great price" - which is an extremely laudable thing, of course, at the price point, I hope it's clear I'm not saying "pfffft that's nothin'" - but (multiple) 50% overclocking?

Reasonable expectations from what I'm seeing are more like a ceiling at 1100mhz-ish with some able and some not able to go much further. But if I'm looking in the wrong place, help me out, I keep up with nVidia's technology more than AMD/ATI's (example: I didn't know the previous 1050mhz ceiling was artificial :v:)

Arzachel
May 12, 2012

Agreed posted:

Could you point out some of these? 'Cause I'm looking but not having an easy time finding them. It seems more like "yeah, they overclock really well, you can pretty safely expect a good OC out of them and performance that deprecates the GTX 570 completely and pushes up against nVidia's top end last gen card for a great price" - which is an extremely laudable thing, of course, at the price point, I hope it's clear I'm not saying "pfffft that's nothin'" - but (multiple) 50% overclocking?

Reasonable expectations from what I'm seeing are more like a ceiling at 1100mhz-ish with some able and some not able to go much further. But if I'm looking in the wrong place, help me out, I keep up with nVidia's technology more than AMD/ATI's (example: I didn't know the previous 1050mhz ceiling was artificial :v:)

This one is on water, I believe; the guy got the same card to 1260 MHz core on air if I remember correctly: http://forums.anandtech.com/showthread.php?t=2245585

Between BIOS flashing and throttling woes, there are several people running stable at 1200 MHz core and over on air: http://forums.anandtech.com/showthread.php?t=2239216&page=26

I found out about the earlier way to avoid the clock limitation here (OverclockersUK): http://forums.overclockers.co.uk/showthread.php?s=ecc68cd63d79e6708f55e701e85a34b5&t=18389760

Fake edit: oh wow. There is a way to run the cards at 1.3v without throttling now; the guy in that OCUK thread is running stable at 1350 MHz core.

Real edit: There are two different reference 7850s. One using the same PCB as the 7770 and one on the same PCB as the 7870. Most 7870s run at 1.21v. You can probably see where this is going. Both the OC and non-OC Sapphire 7850 seem to use the 7870 PCB.

Arzachel fucked around with this message at 22:18 on May 26, 2012

Agreed
Dec 30, 2003


So it's a bit of a retread of the 6950 flash to 6970 thing, then, but even more risky - sure, knock yourself out, but you're violating the warranty hardcore with a hacked BIOS and if it turns out that card had a non-artificial-segmentation reason for being a 7850, you might end up with a dead card.

If it's anything like previous AMD/ATI flash gambles, early bird gets the worm as they want to hit that market hard and grab up price:performance seekers as soon as possible before nVidia makes their next move. Hopefully TSMC isn't still lagging on production and they can nab the price:performance bracket while the nabbing's good.

Arzachel
May 12, 2012

Agreed posted:

So it's a bit of a retread of the 6950 flash to 6970 thing, then, but even more risky - sure, knock yourself out, but you're violating the warranty hardcore with a hacked BIOS and if it turns out that card had a non-artificial-segmentation reason for being a 7850, you might end up with a dead card.

If it's anything like previous AMD/ATI flash gambles, early bird gets the worm as they want to hit that market hard and grab up price:performance seekers as soon as possible before nVidia makes their next move. Hopefully TSMC isn't still lagging on production and they can nab the price:performance bracket while the nabbing's good.

Not quite. The BIOS flash only allows running the chip at 1.3v without throttling because shader units have been fused off. That said, this option allows for insane overclocking; the guy in that OCUK thread is running the core 60% higher than stock :drat: Everyone can make use of the beefier circuitry on the 7870 PCB, though: a lot of people seem to settle for 1.21v (the stock voltage on the 7870) and ~1200 MHz core (+40%).

But yeah, BIOS mods are pretty risky and I wouldn't and don't do them myself.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

Agreed posted:

So it's a bit of a retread of the 6950 flash to 6970 thing, then, but even more risky - sure, knock yourself out, but you're violating the warranty hardcore with a hacked BIOS and if it turns out that card had a non-artificial-segmentation reason for being a 7850, you might end up with a dead card.

If it's anything like previous AMD/ATI flash gambles, early bird gets the worm as they want to hit that market hard and grab up price:performance seekers as soon as possible before nVidia makes their next move. Hopefully TSMC isn't still lagging on production and they can nab the price:performance bracket while the nabbing's good.

I always wondered where my warranty status lay with that 6950 shader flash. I was lucky and had a Sapphire with a literal switch on the card to flip for the unlocked shaders.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
I bought a used 6950 when I wanted to flash one. XFX (lol), but it has a double-lifetime warranty, and the warranty text was pretty forgiving about mods and overclocking, so I'm happy with it for the price. Since it was a pre-flashed card, the only thing I really had to do was add a better cooler. All of the reference 6950s have the BIOS switch - Sapphire was the only one IIRC that had a switch specifically for unlocking their non-reference cards.

spasticColon
Sep 22, 2004

According to GPU-Z, the voltage on my 7850 is 1.075V so how far could I push it on that voltage?

Agreed
Dec 30, 2003


spasticColon posted:

According to GPU-Z, the voltage on my 7850 is 1.075V so how far could I push it on that voltage?

There really isn't a blanket statement that applies there. Overclock it, test it, come back and report. Help us build information to give people, experiences to share :)

spasticColon
Sep 22, 2004


Agreed posted:

There really isn't a blanket statement that applies there. Overclock it, test it, come back and report. Help us build information to give people, experiences to share :)

I have now pushed it to 1050 MHz, which is as far as AMD OverDrive in the Catalyst Control Center will allow, and the Heaven benchmark 3.0 has been running fine for over an hour now, but I haven't messed with the VRAM clocks yet.

Arzachel
May 12, 2012

spasticColon posted:

I have now pushed it to 1050MHz which is as far AMD OverDrive in the Catalyst Control Center will allow and the Heaven benchmark 3.0 has been running fine for over an hour now but I haven't messed with the VRAM clocks yet.

To go over 1050 MHz, you'll need to use either Sapphire Trixx or Asus GPU Tweak. I would recommend using the Sapphire app for the actual overclocking, but you might have to use the Asus app to remove the limitation; not sure if Sapphire allows that yet. It's generally not worth it to mess with the memory clocks too much, because the gains are small and error correction kicks in pretty quickly. Find the highest core clock at the voltage you're comfortable with, and only then bump up the memory clocks a bit, testing with some benchmark to make sure your scores don't deteriorate.

Oh, also set the power control settings to +20% in CCC, so you don't get throttled.
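To make the order of operations explicit, here's the tuning loop I'm describing as a sketch. The `set_clocks`/`run_benchmark` callables are hypothetical stand-ins - in practice you move the slider in Trixx or GPU Tweak and run Heaven by hand - but the logic is the same:

```python
# Sketch of the tuning procedure: raise the core clock in small steps,
# re-testing at every step, until the benchmark fails, then back off.
# set_clocks/run_benchmark are hypothetical placeholders for manual
# slider-moving and benchmark runs.

def find_stable_core(set_clocks, run_benchmark, start_mhz, step=25, limit=1300):
    """Return the highest core clock (MHz) that passed the benchmark."""
    stable = start_mhz
    mhz = start_mhz + step
    while mhz <= limit:
        set_clocks(core=mhz)
        if not run_benchmark():
            break  # artifacts or crash: keep the last good clock
        stable = mhz
        mhz += step
    return stable

# Simulated run: pretend this particular card is stable up to 1150 MHz.
state = {"core": 860}
def set_clocks(core):
    state["core"] = core
def run_benchmark():
    return state["core"] <= 1150

print(find_stable_core(set_clocks, run_benchmark, start_mhz=1000))  # 1150
```

Only after that search settles do you touch memory, for the reasons above.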

Arzachel fucked around with this message at 13:12 on May 27, 2012

Agreed
Dec 30, 2003


Also worth noting that the answer to "how much memory bandwidth does it have?" is already "plenty," so raising memory clocks - and as a result spending power on the memory controller and GDDR5 instead of, you know, the GPU and shaders - is pretty moot. I'd almost advocate leaving memory bone stock 'til you find your GPU clockrate wall, and then only raising memory from there, as power permits.
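To put a number on "plenty," here's the back-of-envelope bandwidth math. The 1200 MHz / 256-bit figures are an assumption (the reference 7850 memory spec), not from the thread:

```python
# Peak GDDR5 bandwidth from memory clock and bus width. GDDR5 moves
# 4 bits per pin per command clock, hence the factor of 4.

def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """Theoretical peak bandwidth in GB/s."""
    effective_gbps = mem_clock_mhz * 4 / 1000   # per-pin data rate in Gbps
    return effective_gbps * bus_width_bits / 8  # across the whole bus

print(round(gddr5_bandwidth_gbs(1200, 256), 1))  # reference 7850: 153.6 GB/s
```

A ~10% memory overclock buys you maybe 15 GB/s you probably weren't saturating anyway, which is why core clock is where the wins are.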

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~
The GTX 670 comes in 2GB and 4GB models, if I would like to game at 2560x1440, would the 2GB version suffice? And I'd like to be able to use things like the High-Res texture pack for Skyrim, would that make any difference to go for the 4GB instead?

Agreed
Dec 30, 2003


Meta Ridley posted:

The GTX 670 comes in 2GB and 4GB models, if I would like to game at 2560x1440, would the 2GB version suffice? And I'd like to be able to use things like the High-Res texture pack for Skyrim, would that make any difference to go for the 4GB instead?

Rule of thumb is don't bother with higher-VRAM models unless you're planning on using them in SLI, since one GPU will bottleneck before the memory does. However, at very high resolutions, I mean, I guess, maybe it's possible there could be a reason you would want 4GB of VRAM... I dunno. Probably not. We were all fairly astounded that AMD/ATI launched with 3GB of GDDR5 on their card; they seem to have been betting on a lot of people doing multi-monitor. nVidia is sensibly saving costs by keeping to a normal amount of VRAM for modern games. A bit of a blunder by AMD/ATI there.

1GB isn't enough at those resolutions, certainly. I remember a [H]ard-style test gaming across three 1440p monitors where tri-CrossFire 6970s punished tri-SLI GTX 580s. That's partly because, at least through Fermi, nVidia has a MASSIVE loss of GPU power per card past two (really bad scalability), but significantly, the 580s' 1.5GB of VRAM was getting choked pretty badly at 3x1440p and the 6970s' 2GB wasn't.

Not enough VRAM is a bad thing, but more than enough is really pointless. I would imagine that 2GB is plenty, even with one 2560x1440 monitor.
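A quick sanity check on why resolution alone doesn't eat much VRAM - the render targets are tiny next to textures. The buffer count of 3 (front/back/depth) is a loose assumption for illustration:

```python
# Approximate VRAM consumed by the framebuffers themselves.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough size of front/back/depth buffers in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

print(round(framebuffer_mb(2560, 1440), 1))      # one 1440p monitor: ~42 MiB
print(round(framebuffer_mb(3 * 2560, 1440), 1))  # 3x1440p surround: ~127 MiB
```

Even triple-1440p is barely a tenth of a gigabyte of framebuffer; it's the textures and AA buffers games load at those settings that actually fill the 1.5-2GB cards.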

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
2GB of RAM doesn't seem to limit the 680 in SLI when working with 3x1920x1080, so I don't think a single 2560x1440 monitor will provide any VRAM issues whatsoever.

Agreed
Dec 30, 2003


Factory Factory posted:

2GB of RAM doesn't seem to limit the 680 in SLI when working with 3x1920x1080, so I don't think a single 2560x1440 monitor will provide any VRAM issues whatsoever.

There you go, no problems expected and no need to spend the extra for needless VRAM :)

Goatman Sacks
Apr 4, 2011

by FactsAreUseless
Not sure if this is the right place to ask, but does anyone know if the GTX 680 works on folding@home? Their new streamlined client doesn't seem to want to get started on anything gpu-wise.

Agreed
Dec 30, 2003


nVidia's expanded GPGPU compatibility with Kepler, though GK104 (that's the GTX 680/670) is kneecapped pretty badly for compute performance compared to the upcoming GK110.

Factory Factory
Mar 19, 2010

The FaH guys don't have a client that will fold on a 680, actually, so all that is moot until there's a software update.

spasticColon
Sep 22, 2004

Yeah, my HD7850 won't fold either until there's a software update for Folding@Home. As for the overclocking, I'm going to leave it at 1050MHz and call it a day. From what I've read online, going higher would probably mean bumping up the voltage, and I don't feel like doing that.

Mayo_Zedong
Nov 6, 2006

hai
I'm thinking of jumping on the Nvidia bandwagon, but I have a concern. A few years back, when I owned an Nvidia card (8800 Ultra), I used to get a driver issue that many others got as well, where your game or program would crash and a pop-up box would come up saying "nvlddmkm stopped responding and has recovered," which would then require a Windows restart. At the time the only solution that seemed to work was reverting to Windows XP. After looking around online, it seems this problem still exists with newer cards. So my question is: has anyone else here had this problem, or is it just a small number of users that still experience it?

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM
The last time I experienced that error was back when I had an AMD 4870 (2008-2010).
Then again, my power supply at the time was weird as hell and only supplied 6 volts on the 12 volt rail and 3-something on the 5 volt rail. I have no idea how that system stayed stable for so long.

Alpha Mayo
Jan 15, 2007
I haven't seen that error since the 8800GTS 320MB days (like 5 years ago). It always recovered for me too and I never had to restart, which is better than a BSOD even though it was annoying.

Cavauro
Jan 9, 2008

I had that issue very frequently with an EVGA GTX260. Probably twice a week at least, and I assumed I'd just wait until the card died and buy something else. It never died but after buying a 560 Ti last November I haven't had it again. I have had another similar problem where video will make the screen pink and either recover or not, but that went away with the latest driver.

EvilCoolAidMan
Jun 26, 2008
Does anyone know if the new Thunderbolt-equipped motherboards from MSI and ASUS can drive a Thunderbolt display through a graphics card? Or do we need to wait for nVidia and AMD to build in support for it?

Factory Factory
Mar 19, 2010

They can, using Lucid Virtu. Otherwise they cannot, as there's no DisplayPort connection from the video card to the Thunderbolt adapter.

https://www.youtube.com/watch?v=O1t7Rc9qFgI

mike_348
Apr 30, 2009
Picked up a Gigabyte 670 2GB overclock. I haven't tried overclocking it yet, but so far I'm running Crysis 2 at Ultra High, Battlefield 3 at Ultra High and Skyrim HD at Ultra High all at silky smooth FPS.

Factory Factory
Mar 19, 2010

AnandTech has a dual feature on an industrial PC and the "Cedar Trail" Intel Atom that runs it. Cedar Trail is a rarity among Intel chips - its IGP is not Intel-designed, but rather an IP block from PowerVR, the SGX 545. The GPU is branded as the GMA 3600/3650, depending on clock speed.

The PowerVR Series 5 (a.k.a. SGX) is one of those low-power-optimized architectures. It is entirely DirectX 10.1 compliant, but like Intel's HD Graphics GPUs, much is accomplished in efficient fixed-function hardware. It's used extensively in smartphones and tablets, such as the iPhone 4, first iPad, Palm Pre, BlackBerry Playbook, and Samsung Galaxy S. It runs at very low clockspeeds - the SGX 545 is the most powerful unit in Series 5, and its as-designed clockspeed is only 200 MHz. Intel runs the GPU at up to 650 MHz in Cedar Trail, however.


Block diagram of an SGX GPU

You might think it's odd that a phablet GPU is being put in a netbook platform. Well, this third-gen Atom core is identical to the first-gen Atom core, just clocked higher; it's not much faster than top-end phablet CPUs at this point, so why not give it graphics to match?


The Cedarview SoC, including the Cedar Trail CPU core and PowerVR GPU IP block

That second block diagram and other material promise a lot from this GPU and its associated hardware for this update to the Atom platform:
  • DirectX 10.1 support
  • Hardware-accelerated HD video decoding
  • Twice the performance of previous generation Atom's graphics
But guess what? None of that loving works.

Intel has provided the shittiest of drivers for the GMA 3650. The launch drivers had major problems with screen tearing and stuttering... on the Goddamn desktop. The GPU can't handle a Windows desktop, regardless of settings - any resolution, Aero on or off. And the update package for newer drivers hoses your OS install and prevents you from entering Windows at all. You have to flatten and reinstall if you're updating from the launch drivers.

Once you have those new drivers installed, things are improved... somewhat. You can now display a blank desktop properly, but if you get saucy and move a window, the system lags to hell. HD video decode? A solid "Almost." 720p YouTube works. 1080p YouTube stutters all over and drops frames. Netflix and Hulu are SD-only.

And did I mention? Windows 7 32-bit only. No 64-bit, no other OS, not even Linux.

Intel can write better drivers. The phone version of Cedar Trail, Medfield, has fantastic Android support, and HD 4000 is a paragon of functionality compared to the GMA 3650. But they have not written better drivers. That the GMA 36x0's drivers are in this state for a shipping Intel product in 2012 is just crazy.

MeruFM
Jul 27, 2010
EVGA GTX 670 FTW edition report:
Got the core clock to 1204 MHz; 1205 crashed in Unigine after 2 hours.
Now testing memory, but it doesn't seem to go much past 6250 MHz. Not worth OCing, since the stock OC is already 6200 MHz.

Compared to my old GTX470 with 15% core OC, it's drat impressive. Also very quiet.

2560x1440 battlefield 3 maxed
GTX670: 45-50fps, 1.7GB Vram
GTX470: 10-20fps

2560x1440 Diablo 3 maxed
GTX670: 60fps, 1.5GB Vram
GTX470: 45-50fps

1920x1080 Super Mario Galaxy emulator
GTX670: 55-60fps
GTX470: 40-50fps

2560x1440 Skyrim maxed, HD texture mods, and crazy Russian man's ENB graphics wrapper with all the bells and whistles:
GTX670: 35fps. Consistently capping the 2GB Vram. Slow loading and occasional hiccups.
GTX470: Crash, computer frozen. Not kernel panic.

1920x1080 Skyrim maxed, HD texture mods, and crazy Russian man's ENB graphics wrapper with all the bells and whistles:
GTX670: 50-60fps. 1.7-2GB VRAM used. Still slow loading.
GTX470: 17fps

2560x1440 Crysis Warhead Maxed
GTX670: Will report later
GTX470: 25fps

I was slightly worried about the 2GB of VRAM, and my fears were somewhat confirmed. I suspect hitting 2GB at 2560x1440+ will be pretty easy for future games, but the only released game that consistently caps it was Skyrim with some outrageous 2048x2048 textures.
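Summing up those numbers as rough speedup factors, taking midpoints of the quoted fps ranges (my reading of the report above, not exact measurements):

```python
# Rough GTX 670 vs GTX 470 speedups from the fps figures reported
# above, using midpoints of the quoted ranges.
results = {
    "BF3 2560x1440 maxed":      (47.5, 15.0),  # (GTX 670 fps, GTX 470 fps)
    "Diablo 3 2560x1440 maxed": (60.0, 47.5),
    "Skyrim 1920x1080 modded":  (55.0, 17.0),
}
for game, (gtx670, gtx470) in results.items():
    print(f"{game}: {gtx670 / gtx470:.1f}x faster")
```

Roughly 3x in the GPU-bound cases, and much less in Diablo 3, where the 670 is just sitting at the 60 fps cap.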

HalloKitty
Sep 30, 2005

Thanks for that, because it does show you can max out 2GB of VRAM even with a single monitor, which I'd suspected. Just at 1920x1200 I've seen 1.8GB used on my 6950 when running Skyrim with a ton of mods, but admittedly that was an extreme peak.


incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Intel is shipping poo poo (human fecal matter) and getting people to buy it.

Think about this: the hardware ONLY supports DX10.1, and the drivers ship with DX9 natively. We're two years and a service pack into Windows 7 and they're shipping DX9 hardware.

incoherent fucked around with this message at 09:04 on May 29, 2012
