Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

It sucks, but I had my pre-op evaluation today and as a result I am completely exhausted and in no shape to address the critiques, both valid and more debatable, of my general point - but I am very glad to see the discussion happening, and glad I was able to kick it off. Glad it's been civil too, proving again that here in SH/SC we're not a bunch of idiot fanboys; we're computer users who know what's up and can discuss a controversial issue with understandable passion but no vitriol. I wish I were in a state to add more of my thoughts, but it's been a long day - I think I have had more ionizing radiation by this point than most people will get in their whole lives. Holy moly. But three cheers for a civil discussion, and thanks to Movax and Alereon for taking time out from watching the general goings-on to add your bits as well.

How the 8000 refresh launches will go a long way toward determining whether I keep my fairly strongly negative view of modern ATI drivers. I do want to clarify something: I used the wrong terminology earlier. I meant to point to ATI's issue of requiring two separate utilities, especially for Crossfire gaming. It took them too long, in my opinion, to weld those disparate (and problematically versioned) but necessary chunks of driver software together.

But then, nVidia's recently got a bit of bloat going on too, adding an unnecessary "Experience" auto-tuner tool, which I think is a huge waste of time and a small waste of resources for folks who own high-end cards. I could be overestimating the PC gaming market, but I feel it's really dumb to have a fully integrated, solid-as-a-rock control panel AND this epiphenomenal, tacked-on thing that tries to do stuff for you that you ought to be perfectly able to do yourself.

These are both user-experience criticisms: ATI was slow to integrate two significant pieces of software into one seamless control panel, while nVidia has now unnecessarily bloated their user experience - though it may be of some use, I don't know.

Finally, I am still way, way not on board with the idea that beta drivers = instability in lightly accelerated applications. That hasn't been an issue with nVidia drivers since the 170 era; their modern beta drivers exist literally just to get a driver out quicker without having to dick with Microsoft's WHQL process. For a more "exciting" experience, use the developer drivers - those are a lot closer to what AMD/ATI's standard-issue beta driver process is like. I think it's unreasonable for AMD's graphics division to expect users to be cool with random stuff going all wrong. Actually, I'd kind of like some corroboration that this is even the case more broadly - it sounds to me like it might be a system gestalt issue (loving gremlins!), not necessarily a driver problem.

Anyway. I'll lay off them if they manage good software support for the upcoming hardware and don't break support for the 7000 series or 6000 series in the process (hasn't official support already been discontinued for the 5000 series, or was that one generation earlier?... either way, too early in my opinion).


FetalDave
Jun 25, 2011

Moumantai!

Alereon posted:

There's two stuttering problems: the micro-stutter associated with Crossfire, and the general stutter caused by the AMD drivers which would also occur on single cards. The latter problem is already significantly improved and is getting better with each Beta driver update, but is still an issue.

This. Microstutter is only supposed to occur with a dual-GPU setup; I have a single-GPU setup and the stutter still happens.

I might be getting closer to figuring it out, though. In my thread in tech support someone suggested running FrafsBenchViewer while the stuttering is happening, and it turns out there is almost exactly 1 second between the stutters. Maybe it's something in the drivers polling the card for info every second?
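
In case anyone wants to check their own logs, here's the rough Python I'm using to measure the gaps. It's a minimal sketch that assumes a FRAPS-style frametimes CSV (frame number, cumulative milliseconds); the file name and spike threshold are just placeholders, so adjust for your own capture.

code:
# Rough sketch: find frametime spikes in a FRAPS-style frametimes.csv and
# print the intervals between them. File name and threshold are guesses.
import csv
import statistics

def spike_times(path, factor=3.0):
    """Return timestamps (ms) of frames that took > factor x the median frametime."""
    with open(path, newline="") as f:
        rows = [r for r in csv.reader(f) if r and r[0].strip().isdigit()]
    times = [float(r[1]) for r in rows]                 # cumulative ms per frame
    deltas = [b - a for a, b in zip(times, times[1:])]  # per-frame ms
    median = statistics.median(deltas)
    return [t for t, d in zip(times[1:], deltas) if d > factor * median]

spikes = spike_times("frametimes.csv")
for a, b in zip(spikes, spikes[1:]):
    print(f"{b - a:.0f} ms between stutters")
# Gaps clustering around ~1000 ms would point at something firing once a second.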

FetalDave fucked around with this message at 03:08 on Jun 6, 2013

EA Sports
Feb 10, 2007

by Azathoth
The person who brought up stuttering talked about it being bad in CS:GO. Honestly, I use a GTX 250 and my friend uses a GTX 570, and the game still has stuttering issues. I've always had problems with Source engine games because of that, though I was surprised that CS:GO had it, because I remember Valve basically patching it out of every Source game other than HL2, which still stutters for me today.

Klyith
Aug 3, 2007

GBS Pledge Week

FetalDave posted:

This. Microstutter is only supposed to occur with a dual-GPU setup; I have a single-GPU setup and the stutter still happens.

I might be getting closer to figuring it out, though. In my thread in tech support someone suggested running FrafsBenchViewer while the stuttering is happening, and it turns out there is almost exactly 1 second between the stutters. Maybe it's something in the drivers polling the card for info every second?
I replied in your thread, but for the benefit of this one and the discussion of whether AMD's drivers are good or bad:

quote:

Vsync actually makes it worse. On top of the jitteriness, there's now a distortion bar that runs horizontally and moves from the top of the screen to the bottom every 5 seconds.
AMD does not write drivers that badly.

They may not be as good as nvidia, but I'm pretty sure they can catch a bug that obvious. I feel quite bad for them that in situations where poo poo is broken and their card happens to be present, people will just say "their drivers suck, get an nvidia card".

forbidden dialectics
Jul 26, 2005





Well, it's like anything - when it works, it's great; if it's broken, you hate it.

I just dumped my Crossfire 5850s that I was happy with for nearly 4 years for a single GTX 670, and the improvement is night and day for the games I play. Well, no poo poo! The games I played were broken or didn't scale with Crossfire! But in all the others - I probably could have just held out indefinitely.

The nVidia drivers do objectively have more features - adaptive vsync and forcing FXAA are pretty loving awesome.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

Nostrum posted:

The nVidia drivers do objectively have more features - adaptive vsync and forcing FXAA are pretty loving awesome.

This reminds me why I urge everybody with an AMD card to use RadeonPro. Seriously, ditch CCC and use this. Force any AA, whether it's SMAA, FXAA, or the more expensive ones. Force ambient occlusion (in games the app's dev supports). Adaptive Vsync or dynamic Vsync.

EightBit
Jan 7, 2006
I spent money on this line of text just to make the "Stupid Newbie" go away.

Endymion FRS MK1 posted:

This reminds me why I urge everybody with an AMD card to use RadeonPro. Seriously, ditch CCC and use this. Force any AA, whether it's SMAA, FXAA, or the more expensive ones. Force ambient occlusion (in games the app's dev supports). Adaptive Vsync or dynamic Vsync.

Can you force ambient occlusion off? I hate the cheap way it adds black auras to everything.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
I just started using RadeonPro and have been enjoying the new features (mostly I just like having the direct ATI Tray Tools-style control that I lacked). I ditched the CCC entirely a long time ago though. RadeonPro's pretty cool, although it'd be nice if you didn't have to configure everything per-application.


I haven't really had any problems with AMD/ATI drivers to speak of, at least not since the 4xxx era. I had a few 8000-series nVidia cards that were pretty cool too, although I usually buy AMD/ATI; I've used their cards for so long that it's easier to know when and what to upgrade to.

NickelSkeleton
Jul 2, 2004

Put another nickel in.
I finally just got my replacement SAPPHIRE Radeon HD 7950 3GB w/Boost after my first one had a GPU failure. The card just stopped working during Bioshock Infinite.

Can you guys recommend any settings so I don't have to deal with that bullshit again? I read that manually turning up the fan in CCC before gaming is a good idea.

randyest
Sep 1, 2004

by R. Guyovich

NickelSkeleton posted:

I finally just got my replacement SAPPHIRE Radeon HD 7950 3GB w/Boost after my first one had a GPU failure. The card just stopped working during Bioshock Infinite.

Can you guys recommend any settings so I don't have to deal with that bullshit again? I read that manually turning up the fan in CCC before gaming is a good idea.
Is it normal to have to change stock settings to avoid destroying a video card? I know it's possible to bump up the voltage and clock rates to a point that can break something but I assumed cards running at stock rates would be ok. Is this not the case?

EightBit
Jan 7, 2006
I spent money on this line of text just to make the "Stupid Newbie" go away.

randyest posted:

Is it normal to have to change stock settings to avoid destroying a video card? I know it's possible to bump up the voltage and clock rates to a point that can break something but I assumed cards running at stock rates would be ok. Is this not the case?

It's pretty hard to tell what killed it, though. Could have been a really unlucky neutrino/cosmic ray strike, could have been a faulty chip. GPUs typically handle heat far better than CPUs, and people freak out when their fans only start to spin up at 70C, which is fine for stock clocks; if you are overclocking, you need to set the fan to a higher setting, as the higher clocks and voltage make the chip more sensitive to heat-induced instability.
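
To make that concrete, this is the kind of custom fan curve I mean - a minimal Python sketch of the temperature-to-duty mapping you'd set up in Afterburner or a similar tool. The actual points are made up; tune them for your own card and noise tolerance.

code:
# Hypothetical fan curve: (temperature C, fan duty %) points, linearly
# interpolated. An overclocked card would want these points shifted up.
CURVE = [(40, 30), (60, 45), (70, 60), (80, 85), (90, 100)]

def fan_percent(temp_c):
    """Linearly interpolate fan duty between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at 100% above the last point

for t in (50, 70, 85):
    print(f"{t} C -> {fan_percent(t):.0f}% fan")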

PC LOAD LETTER
May 23, 2005
WTF?!

randyest posted:

Is it normal to have to change stock settings to avoid destroying a video card?
Nope. The card manufacturer most likely screwed up. For some reason XFX is pretty failure-prone this time around, too.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

LCD Deathpanel posted:

RadeonPro's pretty cool, although it'd be nice if you didn't have to configure everything per-application.


You... don't. Just click "Global" at the top of the window.

Aquila
Jan 24, 2003

Does anyone have a favorite card for Linux workstations (Ubuntu 12.04)? No 3D or gaming, just reliable graphics capable of driving two 2560x1440 or 2560x1600 monitors with a minimum of driver pain. One supplemental power connector is OK, and passively cooled would be greatly preferred. Reasonable cost would be nice.

Goon Matchmaker
Oct 23, 2003

I play too much EVE-Online
nVidia.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Aquila posted:

Does anyone have a favorite card for Linux workstations (Ubuntu 12.04)? No 3D or gaming, just reliable graphics capable of driving two 2560x1440 or 2560x1600 monitors with a minimum of driver pain. One supplemental power connector is OK, and passively cooled would be greatly preferred. Reasonable cost would be nice.

Unless I'm misinformed or misremembering, Intel's Linux drivers for Ivy Bridge and Haswell are pretty decent. If the motherboard has the appropriate outputs, that would work.

Otherwise, literally just the cheapest GeForce or Quadro with the appropriate plugs, DL-DVI or DisplayPort.

Talaii
Sep 5, 2003

You crack me up, lil buddy!

Factory Factory posted:

Unless I'm misinformed or misremembering, Intel's Linux drivers for Ivy Bridge and Haswell are pretty decent. If the motherboard has the appropriate outputs, that would work.

Otherwise, literally just the cheapest GeForce or Quadro with the appropriate plugs, DL-DVI or DisplayPort.

The problem is that Intel IGPs don't have dual-link DVI - so you're looking for a motherboard with two DisplayPorts. As far as I'm aware, there's about one in existence: some extremely expensive Gigabyte board with dual Thunderbolt. It's easier and cheaper to just buy a low-end nVidia card with the appropriate outputs.

Haswell may change this with higher resolutions via HDMI, but from my own experience I wouldn't trust the screens to actually run at full resolution off HDMI even if the graphics card supports it - a lot of the 2560x1XXX screens will only accept 1080p/1200p over HDMI.
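
For anyone wondering why dual-link matters, here's the back-of-envelope pixel clock math in Python. The ~10% blanking overhead is my rough approximation, not a real CVT timing calculation, but it lands close enough to the well-known 165/330 MHz link limits:

code:
# Back-of-envelope pixel clock check: why 2560x1440@60 needs dual-link DVI
# (or DisplayPort). Blanking overhead is approximated; real timings differ.
LINK_LIMITS_MHZ = {"single-link DVI / 165 MHz HDMI": 165, "dual-link DVI": 330}

def pixel_clock_mhz(w, h, hz, blanking=1.10):  # ~10% blanking, rough guess
    return w * h * hz * blanking / 1e6

for w, h, hz in ((1920, 1200, 60), (2560, 1440, 60), (2560, 1600, 60)):
    clk = pixel_clock_mhz(w, h, hz)
    fits = [name for name, lim in LINK_LIMITS_MHZ.items() if clk <= lim]
    print(f"{w}x{h}@{hz}: ~{clk:.0f} MHz -> {', '.join(fits) or 'neither'}")
# 1920x1200 squeaks under 165 MHz; both 2560x1XXX modes need dual-link.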

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

HalloKitty posted:

You... don't. Just click "Global" at the top of the window.
Whoa. Thanks - it's been a while since I last used it (everything was per-application at that point), so I hadn't thought to check out the new settings.

testtubebaby
Apr 7, 2008

Where we're going,
we won't need eyes to see.


How do I completely remove the Catalyst/AMD drivers from my Windows 7 machine? I ran through the uninstall utility, and there doesn't seem to be any Catalyst/AMD installation left on my computer, yet AMD Driver Detect tells me that I have the latest drivers installed and I can still run Borderlands 2 @ 1080p with everything cranked.

The reason I want to remove everything is that I want to install an old version of Catalyst, because I think the new one may be what's causing Bioshock Infinite to not even boot on my system (it goes straight to a black screen and then an error message). It was working fine on the 13.3 beta, but started messing up with the new 13.6 beta.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you

zenintrude posted:

How do I completely remove the Catalyst/AMD drivers from my Windows 7 machine? I ran through the uninstall utility, and there doesn't seem to be any Catalyst/AMD installation left on my computer, yet AMD Driver Detect tells me that I have the latest drivers installed and I can still run Borderlands 2 @ 1080p with everything cranked.
You mean you ran the AMD Cleanup Utility and it still left something on your computer?

If so, wow. I've run that program probably 20 times and never had it leave anything behind - I'll reboot and be greeted with a lovely 800x600 screen until I re-install. If that hasn't worked, you may want to try downloading Driver Fusion and seeing if that can completely remove things.

testtubebaby
Apr 7, 2008

Where we're going,
we won't need eyes to see.


real_scud posted:

You mean you ran the AMD Cleanup Utility and it still left something on your computer?

I just ran the uninstaller from within Control Panel, not the AMD Cleanup Utility... the AMD Cleanup Utility cleared out everything, thanks.

That said, I installed the 13.3 beta and Bioshock Infinite still gives me a black screen + error at startup. Now I have no idea why.

[edit] Apparently it has something to do with launching at full screen? Added "-windowed" to the Set Launch Options in Steam and now it boots fine, after which I can set it back to fullscreen in-game... bizarre. Could this have something to do with the 2K/Irrational/AMD opening movies, and if so, is there a way to bypass them?

[edit 2] Disabled the startup videos; it still black-screens at startup if I don't tell it to launch in windowed mode... bizarre.

testtubebaby fucked around with this message at 16:40 on Jun 9, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
13.3 beta? Unless that's a typo, you should update to 13.4 WHQL or 13.6 beta.

Aphrodite
Jun 27, 2006

He said in his first post that he went back because he thought 13.6 was causing his issues.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Oops. Well, that still leaves 13.4 stable.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
To be fair, the problems started on a known-broken install of the drivers, so doing a cleanup and then reinstalling the latest beta drivers is probably the smartest option.

Shimrra Jamaane
Aug 10, 2007

Obscure to all except those well-versed in Yuuzhan Vong lore.
Any idea when the 4GB GTX 770 cards are going to start hitting the market? On the EVGA website they have tons listed, but I can't find any way to purchase them on any site. Is it even worth springing for the 4GB over the 2GB?

mikemelbrooks
Jun 11, 2012

One tough badass

Shimrra Jamaane posted:

Any idea when the 4GB GTX 770 cards are going to start hitting the market? On the EVGA website they have tons listed, but I can't find any way to purchase them on any site. Is it even worth springing for the 4GB over the 2GB?

Unless you are planning to use three monitors at high resolution, no.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

zenintrude posted:

I just ran the uninstaller from within Control Panel, not the AMD Cleanup Utility... the AMD Cleanup Utility cleared out everything, thanks.

That said, I installed the 13.3 beta and Bioshock Infinite still gives me a black screen + error at startup. Now I have no idea why.

[edit] Apparently it has something to do with launching at full screen? Added "-windowed" to the Set Launch Options in Steam and now it boots fine, after which I can set it back to fullscreen in-game... bizarre. Could this have something to do with the 2K/Irrational/AMD opening movies, and if so, is there a way to bypass them?

[edit 2] Disabled the startup videos; it still black-screens at startup if I don't tell it to launch in windowed mode... bizarre.
Uninstall the drivers with Control Panel. Completely remove everything AMD/ATI (GPU-related) with Driver Fusion. Reboot, and run Driver Fusion again with the same settings. If you're using Afterburner, uninstall it and remove the AB directory - old versions of AB used the unofficial overclocking mode, which can cause problems with Bioshock Infinite as I found out, but the new ones allow unlocked official overclocking (use the /XCL switch method to unlock).

Then reinstall the newest beta drivers and, if needed, the latest AB beta (using the /XCL switch for the extended overclocking stuff). Make sure your video card is free of dust.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

mikemelbrooks posted:

Unless you are planning to use three monitors at high resolution, no.
If you're spending $300+ on a graphics card, it drat well better have at least 3GB of RAM. Since nVidia doesn't make 3GB cards, that leaves you with the somewhat unfortunate choice between cards with more RAM than you'll likely ever need and cards without enough VRAM for their expected service life. The reality is that if you buy a 2GB GTX 770, you'll probably retire the card because it doesn't have enough VRAM long before it's too slow or loses driver support.

Magic Underwear
May 14, 2003


Young Orc

Alereon posted:

If you're spending $300+ on a graphics card, it drat well better have at least 3GB of RAM. Since nVidia doesn't make 3GB cards, that leaves you with the somewhat unfortunate choice between cards with more RAM than you'll likely ever need and cards without enough VRAM for their expected service life. The reality is that if you buy a 2GB GTX 770, you'll probably retire the card because it doesn't have enough VRAM long before it's too slow or loses driver support.

First off, nVidia does make 3GB cards: the 780. Second, we learned from the 6XX generation that 2GB is pretty much fine at normal resolutions. Skyrim with ultra texture mods can go beyond 2GB, but almost nothing else can. The other thing is that most games are very lazy about VRAM: they might allocate a lot of it, but they don't really use it. In other words, a game might show 2.5GB used on a 4GB card but will perform exactly the same on the 2GB variant.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Except that 2560x1440 monitors have become affordable and more common thanks to the Korean imports, and we're about to hit a new console generation with vastly more RAM available than last generation's consoles had. It's not guaranteed that 2GB will make a 770 obsolete, but it's silly to say it's not a possibility either. We're talking a console jump from 512MB of combined RAM/VRAM to at least 5GB. The 770 could easily end up with far more shader/compute power than the consoles but less VRAM than a game is coded to use at 1080p.
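
Some rough numbers (my own back-of-envelope in Python, nothing official) on why the resolution itself isn't what eats the 2GB:

code:
# Back-of-envelope VRAM math: the framebuffers are tiny next to 2GB, so the
# real risk is texture/asset budgets sized for the new consoles' RAM.
def framebuffer_mb(w, h, bytes_per_px=4, buffers=3):  # RGBA, triple-buffered
    return w * h * bytes_per_px * buffers / 2**20

for w, h in ((1920, 1080), (2560, 1440)):
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB of color buffers")
# ~24 MB at 1080p and ~42 MB at 1440p. Even tripled again for depth and
# G-buffers, it's a rounding error against 2048 MB - the assets are the risk.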

Factory Factory fucked around with this message at 20:28 on Jun 9, 2013

Shimrra Jamaane
Aug 10, 2007

Obscure to all except those well-versed in Yuuzhan Vong lore.
I guess I'll wait for the 4GB models to go on sale. Should only be another week, I hope.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I'll just quote Anandtech's GTX 770 review on this topic, emphasis mine:

Anandtech's GTX 770 review posted:

Not unlike where we are with 1GB/2GB on mainstream ($150+) cards, we’re at a similar precipice with these enthusiast class cards. Having 2GB of RAM doesn’t impose any real problems today, but I’m left to wonder for how much longer that’s going to be true. The wildcard in all of this will be the next-generation consoles, each of which packs 8GB of RAM, which is quite a lot of RAM for video operations even after everything else is accounted for. With most PC games being ports of console games, there’s a decent risk of 2GB cards being undersized when used with high resolutions and the highest quality art assets. The worst case scenario is only that these highest quality assets may not be usable at playable performance, but considering the high performance of every other aspect of GTX 770 that would be a distinct and unfortunate bottleneck.

The solution for better or worse is doubling the GTX 770 to 4GB. GTX 770 is capable of housing 4GB, and NVIDIA’s partners will be selling 4GB cards in the near future, so 4GB cards will at least be an option. The price premium for 4GB of RAM looks to be around $20-$30, and I expect that will come down some as 4Gb chips start to replace 2Gb chips. 4GB would certainly make the GTX 770 future-proof in that respect, and I suspect it’s a good idea for anyone on a long upgrade cycle, but as always this is a bit of a gamble.

Though I can’t help but feel NVIDIA could have simply sidestepped the whole issue by making 4GB the default, rather than an optional upgrade. As it stands 2GB feels shortsighted, and for a $400 card, a bit small. Given the low cost of additional RAM, a 4GB baseline likely would have been bearable.

Magic Underwear
May 14, 2003


Young Orc

It's a completely valid point, but you made it sound like you'd be an idiot to get any mid-to-high-end card with 2GB, which is not true. We really don't know what is going to happen with VRAM on the new consoles. They are still running at 1080p, after all.

If you're going to stick with the same card for a very long time or jump to very high resolutions, by all means get the 4GB. But the 770 is a beast no matter how much VRAM it has.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

Magic Underwear posted:

They are still running at 1080p, after all.

Uh, no. Games that are developed to run at 1080p will do so, but the new consoles will support 4K. So yeah, you'd be an idiot to buy a 2GB high-end card today.

edit: Nevermind on the PS4, only the Xbox One will support 4K games.

InstantInfidel fucked around with this message at 21:52 on Jun 9, 2013

Wiggly Wayne DDS
Sep 11, 2010



InstantInfidel posted:

Uh, no. Games that are developed to run at 1080p will do so, but the new consoles will support 4K. So yeah, you'd be an idiot to buy a 2GB high-end card today.

edit: Nevermind on the PS4, only the Xbox One will support 4K games.
That's because they're using HDMI 1.4a, which supports 4K at 30Hz - not because the system is powerful enough to handle a game rendered at that size.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Oh, I dunno, a lot of low-impact indie games could be rendered at native rez instead of upscaled. That'd be pretty cool.

Disgustipated
Jul 28, 2003

Black metal ist krieg

InstantInfidel posted:

Uh, no. Games that are developed to run at 1080p will do so, but the new consoles will support 4K. So yeah, you'd be an idiot to buy a 2GB high-end card today.

edit: Nevermind on the PS4, only the Xbox One will support 4K games.
The PS4 has a GPU that's 50% faster than the Xbone's, and it supports 4K output too; they're just not pretending it's powerful enough to actually play games at that res. No way we see 4K games this generation - it'll be a loving miracle if all games run at 1080p.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

InstantInfidel posted:

edit: Nevermind on the PS4, only the Xbox One will support 4K games.

No games will run in 4K unless they are extremely simplistic ones. The PS4 has a far faster GPU than the Xbox One, for a start.

But we're still talking about the PS4 being somewhere between a 7850 and a 7870, and the Xbox One being something more like a 7790++.


InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT

zylche posted:

That's because they're using HDMI 1.4a, which supports 4K at 30Hz - not because the system is powerful enough to handle a game rendered at that size.


Disgustipated posted:

The PS4 has a GPU that's 50% faster than the Xbone's, and it supports 4K output too; they're just not pretending it's powerful enough to actually play games at that res. No way we see 4K games this generation - it'll be a loving miracle if all games run at 1080p.


HalloKitty posted:

No games will run in 4K unless they are extremely simplistic ones. The PS4 has a far faster GPU than the Xbox One, for a start.

But we're still talking about the PS4 being somewhere between a 7850 and a 7870, and the Xbox One being something more like a 7790++.

The Xbox 360, technology from 2005, manages to run Skyrim at 1080p at 30FPS. It's also silly and incorrect to expect hardware to perform identically on two things as different as a PC and a game console. A 7790 that doesn't have to worry about the background bullshit from Windows and hacked-together drivers can put a lot more of its GPU power towards being, you know, a GPU.
