mayodreams
Jul 4, 2003


Hello darkness,
my old friend

Factory Factory posted:

Two neat AnandTech articles today:

First, it followed up on its Thunderbolt article, full of Asus engineering slides that we mere mortals can only begin to understand.


I finally got around to reading this article, and the slides were amazing. My undergrad was in ECE, so it took me back to making PCBs for projects.

I find it really interesting that there are no sharp angles in the traces, and that they had to hollow out the layer underneath them too.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

:stare:

The GTX 680 is an extremely, extremely high performance card. I've left it at stock voltage and I'm slightly less than doubling the 3DMark 11 score I got with my former card, a GTX 580 clocked at 920MHz. I don't consider an overclock "stable" unless it has zero artifacts ever under any circumstances. I think I've actually got some room to go, but all I did was change three things:

1. Power target from 100% to 132% (max on the EVGA Precision X slider)
2. GPU Clock Offset +100
3. Mem Clock Offset +300

That's it. The firmware on the card, which is stock EVGA SC+ firmware, takes care of the rest. It seems to kinda just do what it wants in games: I've seen high clocks and occasionally a dip (it adjusts in really small increments; when it "downclocks" it's only by about 6 or 13 MHz, never more than that), but it generally hangs out around 1290MHz on the GPU and sticks to the memory clock solidly.
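For anyone curious about those little steps, here's a toy Python sketch of clocks snapping to discrete bins. The 13 MHz step size is just what's observed above, and the base clock is a made-up placeholder, not an EVGA spec.

```python
# Toy model of the ~13 MHz "bins" described above. The bin size is just the
# step observed in the post; the base clock is a hypothetical placeholder.

BIN_MHZ = 13            # observed size of a single "downclock" step
BASE_CLOCK_MHZ = 1058   # hypothetical base clock, for illustration only

def quantize_boost(target_mhz: float,
                   base_mhz: int = BASE_CLOCK_MHZ,
                   bin_mhz: int = BIN_MHZ) -> int:
    """Snap a requested clock to the nearest whole bin at or above the base clock."""
    bins = max(round((target_mhz - base_mhz) / bin_mhz), 0)
    return base_mhz + bins * bin_mhz

print(quantize_boost(1290))  # -> 1292, the nearest 13 MHz step to ~1290 MHz
```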

I do have a more aggressive fan profile set up to keep it under 70°C, just to be on the safe side.
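A custom fan profile is basically a list of (temperature, fan speed) points with linear interpolation between them. Here's a minimal Python sketch of that idea; the curve points are made-up placeholders, not Agreed's actual Precision X profile.

```python
# Piecewise-linear fan curve: interpolate fan duty cycle from (temp, fan%) points.
# The points below are hypothetical, chosen only to keep the card under ~70 C.

CURVE = [(30, 30), (50, 45), (60, 60), (70, 85), (80, 100)]  # (deg C, fan %)

def fan_percent(temp_c: float, curve=CURVE) -> float:
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    if temp_c >= curve[-1][0]:
        return float(curve[-1][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(65))  # -> 72.5 (halfway between the 60 C and 70 C points)
```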

Also, I did a bunch of research and was delighted to find out that my power supply, a Corsair HX750, is made by CWT, not Seasonic. It's technically certified Gold, but they downrated it to 750W because at higher temps it slips down to Silver a bit - and it can put out over 900W before any of its safety features kick in. Another way of looking at it is that it's an 850W 80+ Bronze power supply. So I decided to keep my GTX 580 in, returned to stock clocks, to hang out and perform PhysX duties: PhysX makes any single-card setup totally eat poo poo, but the comparatively exceptional compute performance of the GF110 chip means it handles PhysX really well. Holy smooth framerates with PhysX enabled in Batman games, Batman!

Total system power usage doesn't top 600W-650W under a load like that, and most of the time the GTX 580 hangs out at its lowest power and clock states. While there aren't too many games that take advantage of PhysX, the ones that do are extra badass now. :kiddo: And I can keep hoping to find some uses of CUDA since I've got a lifetime warranty and an advance RMA on the 580, and I'm hanging onto it until I can cash that in when it goes tits up :mad:
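For a rough sense of the headroom being described, here's the arithmetic in Python using the figures from the post (the 900 W trip point is Agreed's reading of reviews, not a measurement of this unit):

```python
# Back-of-the-envelope PSU headroom check using the numbers quoted above.

PSU_LABEL_W = 750     # Corsair HX750 label rating
PSU_TRIP_W = 900      # approximate point where protection reportedly kicks in
SYSTEM_DRAW_W = 650   # worst-case system draw quoted in the post

label_load = SYSTEM_DRAW_W / PSU_LABEL_W   # fraction of the label rating in use
trip_margin = PSU_TRIP_W - SYSTEM_DRAW_W   # watts of slack before protection trips

print(f"{label_load:.0%} of label rating, {trip_margin} W below the trip point")
# -> 87% of label rating, 250 W below the trip point
```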

Agreed fucked around with this message at 04:49 on Jun 5, 2012

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
After reading about how fast the GTX 680 and 670 are, I'm worried that Nvidia's mid-range offerings will be faster for the same money or even less, and I already pulled the trigger on an HD 7850. Specifically, I'm worried a GTX 660 Ti will cost the same yet be faster and overclock better.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

is made by CWT

You can usually tell by the fact that Channel Well uses crinkly green tape around the transformers. Oh, and loose screws (that one comes from Jonny Guru).
I'm not a fan, mainly because I had so many Channel Well-made Antec PSUs, and they cost quite a bit. They all failed because of the Fuhjyyu capacitors. I doubt they're using those now, but it left me with a bitter taste.

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

Maybe AMD and Nvidia should work together to release products on a staggered six-month schedule so they can just go back and forth with the performance crown, encouraging people upgrading in one half of the calendar year to buy AMD and people upgrading in the other half to buy Nvidia. Then they could both have a healthy business and they'd still be competing with each other.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

You can usually tell by the fact that Channel Well uses crinkly green tape around the transformers. Oh, and loose screws (that one comes from Jonny Guru).
I'm not a fan, mainly because I had so many Channel Well-made Antec PSUs, and they cost quite a bit. They all failed because of the Fuhjyyu capacitors. I doubt they're using those now, but it left me with a bitter taste.

They aren't; its construction is better than the contemporary Seasonic designs that Corsair is still using, and there's not one loose screw in the unit. It's a fantastic power supply at 750W. It's rather astounding that Corsair is being so fair in labeling it Silver in the first place, when 80 Plus and its room-temperature testing certified it Gold. It drops a little in efficiency - just slightly, but enough to be in the Silver range - in a 50°C test environment.
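For context, the 80 Plus tiers are just minimum efficiencies at 20/50/100% load. A small Python lookup using the commonly cited 115 V thresholds; treat those numbers and the sample measurements as assumptions, not this unit's actual test report.

```python
# Which 80 Plus tier does a set of efficiency measurements clear?
# Thresholds are the commonly cited 115 V (non-redundant) minimums at
# 20% / 50% / 100% load; the measurements below are hypothetical.

TIERS = {
    "80 Plus": (0.80, 0.80, 0.80),
    "Bronze":  (0.82, 0.85, 0.82),
    "Silver":  (0.85, 0.88, 0.85),
    "Gold":    (0.87, 0.90, 0.87),
}

def best_tier(measured):
    best = None
    for tier, mins in TIERS.items():          # dicts keep insertion order
        if all(m >= t for m, t in zip(measured, mins)):
            best = tier
    return best

print(best_tier((0.88, 0.91, 0.88)))  # room temperature: Gold
print(best_tier((0.86, 0.89, 0.86)))  # a point or two lower when hot: Silver
```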

Basically, it's heavily overspecced for its stated usage. That says nothing about other CWT supplies at lower wattages or whatever, but this unit has somewhat more of everything than it absolutely needs compared to its technological contemporaries, and that's why running Metro 2033 was a breeze even though the GTX 680 (overclocked a lot!) and the GTX 580 (stock clocks) were both running at full core/memory.

drat does that make the experience smooth, too (edit: welp finally got metro to smooth vsync with everything on, just in time for them to up the stakes in the next version I'm sure, just shoot me now so I don't keep doing this please). A dedicated PhysX card is a lot cooler for games that support it than Ageia made it out to be, conceptually, but it is exactly as situational as everybody figured it would be in practice.

Edit: Though I'm not sure the left hand is talking to the right when it comes to PhysX - if you look at some games' recommendations, they suggest you use a 9800 GTX or GTX 260, but both of those just slow a modern card down and have since Fermi's fumbling arrival. If your dedicated PhysX card isn't at least within a generation and at least in the good price/performance mid-range bracket, you're probably going to slow down a modern top-end card, which is weird given its memory bandwidth I guess, but still. A GTX 580 is stupid overkill, but if you just happen to have one sitting around... More likely, the least you could get away with if you were silly enough to buy one intentionally would be a 560, maybe a 560 Ti, and then only because Fermi's compute is ultra-super-badass compared to Kepler's.

Agreed fucked around with this message at 14:31 on Jun 5, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Here's the skinny on GK110. 7.1 billion transistors, 15 compute-optimized SMXs, 2,880 CUDA cores, 288 GB/s of memory bandwidth. But it still looks like it's optimized for real-time graphics... :circlefap:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

Here's the skinny on GK110. 7.1 billion transistors, 15 compute-optimized SMXs, 2,880 CUDA cores, 288 GB/s of memory bandwidth. But it still looks like it's optimized for real-time graphics... :circlefap:

In the sense that any gigantodie based on the same underlying architecture is necessarily going to be, I guess. It's possible they could put that into the consumer market space, but it would go very sharply against a number of tremendous accomplishments with the GTX 680. It seems more likely to me that they intend to keep the videogame and workstation/compute markets actually, rather than artificially, segmented...

If ATI does something that requires a response, I feel pretty confident based on the performance of the 680 that nVidia will have one without having to more than double the transistor count with a consumer product that's highly compute-focused and thus pretty inefficient at the given task. That would be a weirdly desperate move which seems unlikely to be necessary. Take back all the neato stuff gained from the Fermi --> Kepler move, d'oh.

CactusWeasle
Aug 1, 2006
It's not a party until the bomb squad says it is
I just replaced a 6850 with a 7850 and I'm quite pleased even at stock settings. It also seems to run 10 degrees cooler under load than the 6850. Both are Sapphire cards; the new card has two fans, while the 6850 had one.

The stock settings for the 7850 are:

GPU clock: 860 MHz
Memory: 1200 MHz
Board power limit: 0
VDDC: 1210 mV

I had to upgrade Trixx, and some of these things have changed names or whatever, so I don't really know what's what. Can someone please give me a moderate overclock setting for it? I got some settings for my 6850 here which really helped that card as well.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
The factory overclocked Sapphire 7850 has a 920MHz core clock, so I'd put money on that being safe, since you have the same card minus the OC.
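In other words, the factory-overclocked SKU suggests the silicon can take roughly a 7% core bump. Quick Python arithmetic with the two clocks mentioned above:

```python
# Headroom implied by the factory-overclocked SKU versus the stock card above.
STOCK_CORE_MHZ = 860
FACTORY_OC_MHZ = 920

delta = FACTORY_OC_MHZ - STOCK_CORE_MHZ
print(f"+{delta} MHz, about {delta / STOCK_CORE_MHZ:.1%} over stock")
# -> +60 MHz, about 7.0% over stock
```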

vv Sorry, I don't have the exact info, I just thought I'd give you a figure you can probably expect to achieve

HalloKitty fucked around with this message at 13:25 on Jun 6, 2012

CactusWeasle
Aug 1, 2006
It's not a party until the bomb squad says it is

HalloKitty posted:

The factory overclocked Sapphire 7850 has a 920MHz core clock, so I'd put money on that being safe, since you have the same card minus the OC.

Do i need to change any of the other settings too?

Thanks

Paino
Apr 21, 2007

by T. Finninho
New owner of a vanilla GTX 670 with the 301.42 drivers.

Diablo 3 runs like utter poo poo, with noticeable slowdowns when a lot of things are happening on screen. The 301.42 drivers seem to be the culprit from what I'm reading around. Is anyone having the same issues?

Also, why is MSI Afterburner showing me a core clock of only 705 MHz (and a shader clock of 1411 MHz)? Shouldn't the core be higher, 905 or something?

Tunga
May 7, 2004

Grimey Drawer

Paino posted:

New owner of a vanilla GTX 670 with the 301.42 drivers.

Diablo 3 runs like utter poo poo, with noticeable slowdowns when a lot of things are happening on screen. The 301.42 drivers seem to be the culprit from what I'm reading around. Is anyone having the same issues?

Also, why is MSI Afterburner showing me a core clock of only 705 MHz (and a shader clock of 1411 MHz)? Shouldn't the core be higher, 905 or something?

Have you checked your temperatures? Maybe it is throttling?

Paino
Apr 21, 2007

by T. Finninho

Tunga posted:

Have you checked your temperatures? Maybe it is throttling?

Tried EVGA Precision and it shows the right clocks when I'm in game. Temperatures are between 55°C and 65°C.

Diablo is still choppy when there's a lot happening on screen. I'm using vsync + triple buffering (screen tearing is horrible in this game), and while I'm usually at 60fps, whenever the screen is full of monsters exploding/spell effects it can drop to 40-20 for 3-4 seconds, then it gets back to 60. It's incredibly annoying and I hope Nvidia actually releases decent drivers for the game EVERYONE IS PLAYING.

EDIT: http://eu.battle.net/d3/en/forum/topic/4551306149 -> this is the issue everyone's having with 500 and 600 series cards. This is very noticeable in Act 3 right before Siegebreaker: with a lot of monsters/effects/smoke on screen, FPS occasionally dips to 15-20, making the game unplayable. I can't even revert to the 296.10 drivers because my GPU is not supported; people with older cards have reported smooth performance without the 301 drivers. Sorry for the off-topic, I hope this helps other goons in this situation.

Paino fucked around with this message at 13:10 on Jun 8, 2012

Josh Lyman
May 24, 2009


Paino posted:

Tried EVGA Precision and it shows the right clocks when I'm in game. Temperatures are between 55°C and 65°C.

Diablo is still choppy when there's a lot happening on screen. I'm using vsync + triple buffering (screen tearing is horrible in this game), and while I'm usually at 60fps, whenever the screen is full of monsters exploding/spell effects it can drop to 40-20 for 3-4 seconds, then it gets back to 60. It's incredibly annoying and I hope Nvidia actually releases decent drivers for the game EVERYONE IS PLAYING.
I recently got a 560 Ti and I have no issues with D3 on 301.42, and I enabled vsync and triple buffering @ 1920x1080.

This is probably stupid, but have you tried reinstalling drivers? You could also roll back to the last version.

chippy
Aug 16, 2006

OK I DON'T GET IT
Isn't D3 quite CPU-bound? What processor do you have? Any chance it could be network latency?

Paino
Apr 21, 2007

by T. Finninho

Josh Lyman posted:

I recently got a 560 Ti and I have no issues with D3 on 301.42, and I enabled vsync and triple buffering @ 1920x1080.

This is probably stupid, but have you tried reinstalling drivers? You could also roll back to the last version.

I edited my previous post to include the Battle.net thread. I cleaned everything with Driver Sweeper before reinstalling, but as I said, I can't roll back to 296.10 because my GPU (GTX 670) is not supported. It literally doesn't let me install them.

My CPU is an i5-2500K, my drive is an Intel SSD, and I have 8GB of RAM. People have been saying to set the affinity to one core in the D3 app properties, but that seems dumb to me. I may try it if nothing else works. It's not latency; the FPS drops are noticeable with Fraps on.
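If you want to put numbers on those dips, Fraps can log per-frame times while you benchmark. Here's a rough Python sketch that counts slow frames from such a log; the filename and the two-column "Frame, Time (ms)" cumulative layout are assumptions, so adjust the parsing to whatever your log actually contains.

```python
# Count frames whose instantaneous FPS falls below a threshold, from a Fraps
# frametimes log. Assumes two columns: frame number and cumulative time in ms.

import csv

def fps_dips(path: str, threshold_fps: float = 30.0):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times_ms = [float(r[1]) for r in rows[1:] if len(r) >= 2]   # skip the header
    deltas = [b - a for a, b in zip(times_ms, times_ms[1:])]    # per-frame ms
    slow = sum(1 for d in deltas if d > 1000.0 / threshold_fps)
    return slow, len(deltas)

slow, total = fps_dips("diablo3 frametimes.csv")   # hypothetical log filename
print(f"{slow} of {total} frames ran below 30 fps")
```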

Josh Lyman
May 24, 2009


Paino posted:

I edited my previous post to include the Battle.net thread. I cleaned everything with Driver Sweeper before reinstalling, but as I said, I can't roll back to 296.10 because my GPU (GTX 670) is not supported. It literally doesn't let me install them.

My CPU is an i5-2500K, my drive is an Intel SSD, and I have 8GB of RAM. People have been saying to set the affinity to one core in the D3 app properties, but that seems dumb to me. I may try it if nothing else works. It's not latency; the FPS drops are noticeable with Fraps on.
I recall having to do the affinity thing for something a couple years back and it did, in fact, fix my problems. Can't recall what for though. :(

evilalien
Jul 29, 2005

Knowledge is born from Curiosity.

Paino posted:

Tried EVGA Precision and it shows the right clocks when I'm in game. Temperatures are between 55°C and 65°C.

Diablo is still choppy when there's a lot happening on screen. I'm using vsync + triple buffering (screen tearing is horrible in this game), and while I'm usually at 60fps, whenever the screen is full of monsters exploding/spell effects it can drop to 40-20 for 3-4 seconds, then it gets back to 60. It's incredibly annoying and I hope Nvidia actually releases decent drivers for the game EVERYONE IS PLAYING.

EDIT: http://eu.battle.net/d3/en/forum/topic/4551306149 -> this is the issue everyone's having with 500 and 600 series cards. This is very noticeable in Act 3 right before Siegebreaker: with a lot of monsters/effects/smoke on screen, FPS occasionally dips to 15-20, making the game unplayable. I can't even revert to the 296.10 drivers because my GPU is not supported; people with older cards have reported smooth performance without the 301 drivers. Sorry for the off-topic, I hope this helps other goons in this situation.

I've noticed the exact same thing on my 670 in Act 3 before the Siegebreaker on my Hell run, and it didn't happen on previous difficulties. I'm not sure if one of the new Diablo patches caused this or if it was a new driver version that I installed.

On a separate note, I have to RMA my EVGA 670 because the fan makes a terrible grinding noise even at idle. At least they are cross-shipping the replacement.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
U.S. Government to AMD: Your drivers are lovely, be more like Intel.

It's about high-value IT security. Intel and Nvidia's drivers all support memory address randomization, which makes it harder to probe for security flaws. AMD's drivers do not, so address randomization cannot be used with AMD hardware.
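If you want to check this yourself, ASLR opt-in is just a flag in each DLL's PE header. A sketch using the third-party pefile module; the driver path is only an example, so point it at whichever DLL you care about.

```python
# Check whether a DLL opts into address randomization (ASLR) by reading the
# dynamic-base flag from its PE optional header. Requires: pip install pefile

import pefile

IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE = 0x0040  # flag set by /DYNAMICBASE

def supports_aslr(dll_path: str) -> bool:
    pe = pefile.PE(dll_path, fast_load=True)
    return bool(pe.OPTIONAL_HEADER.DllCharacteristics
                & IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE)

# Example path only; substitute the display driver DLL you want to inspect.
print(supports_aslr(r"C:\Windows\System32\atiumdag.dll"))
```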

madsushi
Apr 19, 2009

Baller.
#essereFerrari
If you have a 670/680 and you're having problems with D3, it's probably due to V-Sync/Adaptive V-Sync. I know several people who all had this issue (or you can read the 100+ page thread on the nVidia forums).

The solution is to install the 302.71 beta nVidia drivers, which aren't "officially" even released in beta yet. After installing them, I've had no issues in D3.

Star War Sex Parrot
Oct 2, 2003

https://www.youtube.com/watch?v=MOvfn1p92_8

Unreal Engine 4 stuff from E3. That lighting model is insane. The tools are drastically improved too. I'm glad Epic is trying to address the "more detail = more development costs" with better tools.

Berk Berkly
Apr 9, 2009

by zen death robot

Star War Sex Parrot posted:

Unreal Engine 4 stuff from E3. That lighting model is insane. The tools are drastically improved too. I'm glad Epic is trying to address the "more detail = more development costs" with better tools.

I remembered our little aside about Unreal Engine 4 being the software that's going to leverage the GPU hardware of the future, and I was coming here to post that video, since we only had a few preview images to go by last time.

The tools they updated it with are amazing:

Tool Highlights posted:


Make updates directly in game without ever pausing gameplay with Hot Reload. This tool allows you to quickly find and edit C++ code and see those changes reflected immediately in game.

After an update is made, Instant Game Preview gives you the power to spawn a player and play anywhere in game without needing to wait for files to save.

The all-new Code View saves you time by allowing you to browse C++ functions directly on game characters then jump straight to source code lines in Visual Studio to make changes.

Live Kismet Debugging enables you to interactively visualize the flow of gameplay code while testing your game.

Now you can quickly debug and update gameplay behaviors when they happen through the new Simulate Mode. This tool lets you run game logic in the editor viewport and inspect AI as the game characters perform actions.

View your game in full-screen within the editing environment with the Immersive View tool. This allows programmers to complete iterations on gameplay changes without added UI clutter or distractions.

Possess/Eject Features allow at any time while playing in editor to easily “eject” from the player and take control of the camera to inspect specific in-game objects that may not be behaving properly.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
We need a 600 pixel wide :circlefap: for that. I love that indirect lighting.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

madsushi posted:

If you have a 670/680 and you're having problems with D3, it's probably due to V-Sync/Adaptive V-Sync. I know several people who all had this issue (or you can read the 100+ page thread on the nVidia forums).

The solution is to install the 302.71 beta nVidia drivers, which aren't "officially" even released in beta yet. After installing them, I've had no issues in D3.

That's a really dangerous solution for anyone not running Windows 8, it seems. It has a decent chance of hosing your system even when installed correctly. I haven't had any issues with overclocking or with Adaptive Vsync, but I don't play Diablo 3 either. I'm going to go ahead and wait for these to become official before I ruin a good thing to try the next driver set, even though I do expect some performance/feature improvements in the coming updates, as with most new tech. We're, what, two driver releases into Kepler now? Hell, one release (maybe it was the first 300 series, I don't remember off the top of my head) made up to 40% performance improvements in some circumstances on Fermi-based cards, so I'm not discounting any possibilities with regard to the hack-packs that are drivers. But I'm also not going to gently caress around with a literal hack just to install the thing when I haven't had any issues.




Edit: Unreal 4, holy poo poo. I love what's going on with deferred rendering engines lately, can't wait for that tech to start punishing my system as soon as humanly possible. Also, use PhysX please, thank you.

Berk Berkly
Apr 9, 2009

by zen death robot

Factory Factory posted:

We need a 600 pixel wide :circlefap: for that. I love that indirect lighting.

Well, you could cram that many into a 2.3 inch LCD, thanks to the Japanese:

http://www.tomshardware.com/news/Japan-Display-Inc-Retina-Display-651ppi-pixels-per-inch,15913.html


Speaking of which, would it be too much to ask to have 300+ PPI monitors instead of the ever-widening displays we have? I mean, it's going to be a complete waste of hardware to have fewer than two monitors eventually if the pixel density doesn't go up, even for entry-level GPUs.
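The arithmetic behind those density numbers is simple: PPI is the diagonal pixel count divided by the diagonal size in inches. A quick Python sketch, with the 24-inch figures as illustrative assumptions:

```python
# Pixel density: diagonal pixels divided by diagonal inches.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))   # a typical 24" 1080p monitor: ~92 PPI

# Pixels a 24" 16:9 panel would need to reach the 300 PPI wished for above:
diag_px = 300 * 24
print(round(diag_px * 16 / hypot(16, 9)), round(diag_px * 9 / hypot(16, 9)))
# -> roughly 6275 x 3530
```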

tijag
Aug 6, 2002

Agreed posted:

:stare:

The GTX 680 is an extremely, extremely high performance card. I've left it at stock voltage and I'm slightly less than doubling the 3DMark 11 score I got with my former card, a GTX 580 clocked at 920MHz. I don't consider an overclock "stable" unless it has zero artifacts ever under any circumstances. I think I've actually got some room to go, but all I did was change three things:

1. Power target from 100% to 132% (max on the EVGA Precision X slider)
2. GPU Clock Offset +100
3. Mem Clock Offset +300


My GTX 680 [EVGA] crashes games if I run at a GPU clock offset of +70 and a mem clock offset of +250.

You have a great sample.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

tijag posted:

My GTX 680 [EVGA] crashes games if I run at a GPU clock offset of +70 and a mem clock offset of +250.

You have a great sample.

That's on top of the stock boosts thanks to the EVGA "SC+" SKU, too.

I ended up backing it down after extended runs showed some marginal instability. Final offsets for 100% stability and stupidly good performance are +87 core and +300 memory at a 132% power target (why not? A custom fan profile keeps it under 55°C in Metro 2033, thanks to Adaptive Vsync - what a great power-saving tool. I've seen it get up to ~110-118% but no higher in use, but since this is the 4-phase rather than the 5-phase VRM design I'd rather not risk it). It does a great job of going only as fast as it needs to for the rendering task of the moment; I've seen it boost as high as 1300MHz on the core, but generally it's in the 1260s-1280s range.

I did set the voltage to 1.15V, which seems to help with memory offset stability since it starts closer to where it's going to end up anyway. In demanding usage scenarios it gets to 1.175V, which is as high as it'll go - automatically, based on TDP and the regulators, I guess.

I am definitely pleased as punch with how its performance turned out. I've been reading around, and people seem to think the EVGA SC cards are binned for specific performance targets, which does fit what little we know (and the lots of speculation) about their general division of chips into different SKUs. If that's true, mine is hitting fairly far above average for an SC/SC+ (the only difference is the backplate, which is "supposed" to improve cooling; I have no idea if it does, but the card does stay cool and quiet with a custom fan profile and the stock EVGA blower).

Tunga
May 7, 2004

Grimey Drawer
How do I get games to run across two monitors on my GTX 670? I'm used to ATI, and under that... well, it just worked. Currently I'm trying to play Burnout Paradise and I'm getting both "screens" showing up on one monitor with giant borders. What am I missing?

KillHour
Oct 28, 2007


I'm pretty sure it has to be three monitors; two won't work. Why would you want to play a game across two monitors, anyway?

Tunga
May 7, 2004

Grimey Drawer

KillHour posted:

I'm pretty sure it has to be three monitors; two won't work. Why would you want to play a game across two monitors, anyway?
Umm, is this a joke? I am hosed if this doesn't work. Some games are not retarded and enable you to shift the camera focus such that two monitors actually work. Most importantly for me that is EVE. But also, as it happens, Burnout Paradise.

ATI handles this poo poo perfectly. This is the first nVidia card I have bought since an 8800GTS four cards ago. And that's not because I love ATI but just because, you know, they made better cards at the price point I wanted during that time.

I'm Googling the poo poo out of this and all I can find is people complaining that Horizontal Span isn't possible in Win 7? Is this a joke? Please tell me it is a joke :( .

Edit: Seriously I just spent £350 on this thing and it can't even do two monitors? My 6870(s) could do this two years ago. Surely nVidia are not this retarded? I really don't want to have to RMA this thing just because it cannot do such a simple thing :( .

Tunga fucked around with this message at 01:22 on Jun 9, 2012

KKKLIP ART
Sep 3, 2004

Got my Asus 670 in and dear jesus it is wonderful. So much quieter than my GTX 260. Maybe when I get some extra cash I'll get one of those fancy Korean IPS displays and push this thing.

mind the walrus
Sep 22, 2006

Tunga posted:

Umm, is this a joke? I am hosed if this doesn't work. Some games are not retarded and enable you to shift the camera focus such that two monitors actually work. Most importantly for me that is EVE. But also, as it happens, Burnout Paradise.

ATI handles this poo poo perfectly. This is the first nVidia card I have bought since an 8800GTS four cards ago. And that's not because I love ATI but just because, you know, they made better cards at the price point I wanted during that time.

I'm Googling the poo poo out of this and all I can find is people complaining that Horizontal Span isn't possible in Win 7? Is this a joke? Please tell me it is a joke :( .

Edit: Seriously I just spent £350 on this thing and it can't even do two monitors? My 6870(s) could do this two years ago. Surely nVidia are not this retarded? I really don't want to have to RMA this thing just because it cannot do such a simple thing :( .

I guess it's more that we wonder why you'd want to do such a thing. Even in games where having the screen across two monitors is the most minimally invasive, you still have two end bezels blocking out the critical center of your screen. This isn't an issue with 3 monitors because you can allow for "blind spots" and make up for it with increased "flank" vision, but with two I can't think of a game where it wouldn't be crippling.

I mean I guess you're right that the card makers should allow you to have that option if you really really want to, but you really don't see how stupid it is to want that in the first place? Seriously how do you even play that way?

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

mind the walrus posted:

I guess it's more that we wonder why you'd want to do such a thing. Even in games where having the screen across two monitors is the most minimally invasive, you still have two end bezels blocking out the critical center of your screen. This isn't an issue with 3 monitors because you can allow for "blind spots" and make up for it with increased "flank" vision, but with two I can't think of a game where it wouldn't be crippling.

I mean I guess you're right that the card makers should allow you to have that option if you really really want to, but you really don't see how stupid it is to want that in the first place? Seriously how do you even play that way?

You don't see how playing on one monitor with extra peripheral vision on one side (instead of two) is better than having no extra peripheral vision?

Tunga
May 7, 2004

Grimey Drawer
I didn't realise this was considered quite so batshit crazy but some games have an option to offset the "center" of the game such that the extra monitor exists as peripheral vision and additional UI space. For example, EVE.

Anyway I did get it working with EVE, since it will run in a full screen window. It seems this frankly stupid limitation only applies to full screen games so I can live with that.

It's still dumb.

mind the walrus
Sep 22, 2006

Jabor posted:

You don't see how playing on one monitor with extra peripheral vision on one side (instead of two) is better than having no extra peripheral vision?

Considering how large most displays are and how most games are formatted to run on one display first and foremost... I don't really think it's a big loss.

Tunga
May 7, 2004

Grimey Drawer

mind the walrus posted:

Considering how large most displays are and how most games are formatted to run on one display first and foremost... I don't really think it's a big loss.
EVE is not most games and requires littering your screen with a hundred different chat windows and lists of players and buttons and more chat windows, and it was the main reason I bought the 670. For Burnout et al, sure, it's not really a big thing.

Anyway problem solved. Now to see how this thing will overclock...

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

mind the walrus posted:

I guess it's more that we wonder why you'd want to do such a thing. Even in games where having the screen across two monitors is the most minimally invasive, you still have two end bezels blocking out the critical center of your screen. This isn't an issue with 3 monitors because you can allow for "blind spots" and make up for it with increased "flank" vision, but with two I can't think of a game where it wouldn't be crippling.

I mean I guess you're right that the card makers should allow you to have that option if you really really want to, but you really don't see how stupid it is to want that in the first place? Seriously how do you even play that way?

Even in WoW you could get addons to move and constrain the viewport, so I imagine the same thing exists with other games.

Not saying it's a common need, but there's no logical reason to disallow it.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Factory Factory posted:

U.S. Government to AMD: Your drivers are lovely, be more like Intel.

It's about high-value IT security. Intel and Nvidia's drivers all support memory address randomization, which makes it harder to probe for security flaws. AMD's drivers do not, so address randomization cannot be used with AMD hardware.

Update to this:

First, it wasn't the US government; technically, it was a security group at my alma mater.

Second, AMD sulked a bit, saying it only affected users who screwed around with security settings, but they will fix the bug.

Fortuitous Bumble
Jan 5, 2007

What are people's experiences with the fan noise from the GTX 670 blower setup? My current Radeon 5770 makes an annoying whining/grinding sort of noise even at idle; I don't know if they all do that, but I'd like to avoid it. I was originally looking at the Asus version to avoid the noise issue, but it seems to be perpetually out of stock, and I saw this EVGA version with the 680 PCB for a bit cheaper, but as far as I can tell it has the reference cooler. Or it might be the 680 cooler, if there's any difference.
