The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Agreed posted:

This benchmark is garbage! They only test the 290X and the 780Ti in situations where the 290X has a clear advantage, like raising the resolution, or enabling more features! Coming away from that, you'd get the impression that for about twice the noise, you can get pretty much the same performance, or slightly better, for nearly $300 less!

Oh wait that's accurate poo poo gently caress

Don't feel bad. I'm still planning on joining you in the 780ti owners club.

BurritoJustice
Oct 9, 2012

Agreed posted:

This benchmark is garbage! They only test the 290X and the 780Ti in situations where the 290X has a clear advantage, like raising the resolution, or enabling more features! Coming away from that, you'd get the impression that for about twice the noise, you can get pretty much the same performance, or slightly better, for nearly $300 less!

Oh wait that's accurate poo poo gently caress

Doesn't take into account the 780ti's 25-35% overclocking headroom with the stock cooler, versus the 290x's pretty much negative overclocking headroom (it gets slower) with the stock cooler at any reasonable noise level. So don't feel too bad? :gbsmith:
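
To put those headroom figures in rough perspective, here's a quick back-of-the-envelope sketch. The reference clock and the assumption that performance scales roughly linearly with core clock are simplifications for illustration only, not numbers from the benchmark being discussed.

code:

# Rough sketch only: assumes performance scales ~linearly with core clock,
# which is optimistic (memory bandwidth and power limits get in the way).
REF_CLOCK_780TI_MHZ = 876        # assumed reference base clock, for illustration
for headroom in (0.25, 0.35):    # the 25-35% range claimed above
    oc_clock = REF_CLOCK_780TI_MHZ * (1 + headroom)
    print(f"{headroom:.0%} headroom -> ~{oc_clock:.0f} MHz core, "
          f"~{headroom:.0%} more throughput in the ideal linear-scaling case")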

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

BurritoJustice posted:

Doesn't take into account the 780ti's 25-35% overclocking headroom with the stock cooler, versus the 290x's pretty much negative overclocking headroom (it gets slower) with the stock cooler at any reasonable noise level. So don't feel too bad? :gbsmith:

But you could get a cooler for the 290X and still come out $200+ ahead. v:shobon:v Though Shadowplay and the assurances that Green Light provides sure are nice.

Byolante
Mar 23, 2008

by Cyrano4747

Ghostpilot posted:

But you could get a cooler for the 290X and still come out $200+ ahead. v:shobon:v Though Shadowplay and the assurances that Green Light provides sure are nice.

I can get a 780Ti for the same price as a 290X and an aftermarket cooler here in lovely sunny Australia, home of the "why? because gently caress you, that's why" technology tax.

CactusWeasle
Aug 1, 2006
It's not a party until the bomb squad says it is
If I'm replacing a 7950 with a 780, what do I need to remove/uninstall to avoid problems with driver conflicts?

Going to order an EVGA 780 Superclocked ACX. The Classified is £40~ more expensive but doesn't seem to be worth the extra.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Byolante posted:

I can get a 780Ti for the same price as a 290X and an aftermarket cooler here in lovely sunny Australia, home of the "why? because gently caress you, that's why" technology tax.

You should think of it in terms of not having to settle for a crappier product, since it's no longer so much cheaper as to be worthwhile. It's really a blessing in disguise.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

CactusWeasle posted:

If I'm replacing a 7950 with a 780, what do I need to remove/uninstall to avoid problems with driver conflicts?

Going to order an EVGA 780 Superclocked ACX. The Classified is £40~ more expensive but doesn't seem to be worth the extra.

I believe the general procedure for cleaning out AMD graphics drivers is to run their uninstaller first and then run Driver Sweeper afterwards to pick up whatever is left over. That's what I've done when I've needed to clean out AMD drivers.

Haeleus
Jun 30, 2009

He made one fatal slip when he tried to match the ranger with the big iron on his hip.
Very satisfied with my EVGA 780 ACX so far, running Battlefield 4 at 100fps on ultra with high FXAA (I get drops to around 70 occasionally with MSAA, so I'm still deciding if it's worth the fps hit). Funny how BF4 is running so much better than the horribly optimized AC4 that, after all, came bundled with the card.

I've managed to hit a stable 1200MHz core clock with Precision X; maybe I'll start fiddling with the memory clock next.

Byolante
Mar 23, 2008

by Cyrano4747

Haeleus posted:

Very satisfied with my EVGA 780 ACX so far, running Battlefield 4 at 100fps on ultra with high FXAA (I get drops to around 70 occasionally with MSAA, so I'm still deciding if it's worth the fps hit). Funny how BF4 is running so much better than the horribly optimized AC4 that, after all, came bundled with the card.

I've managed to hit a stable 1200MHz core clock with Precision X; maybe I'll start fiddling with the memory clock next.

Are you running at 1080p or 1440p?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Here's a thought for an alternative to the NZXT Kraken G10: some folks in SA-mart are offering steel-cutting services with a Goon discount. If you're good at CAD, you can draw up a GPU bracket and get it made.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Byolante posted:

Are you running at 1080p or 1440p?

Those sound like 1080p numbers, judging by the relative performance of my 670. I play BF4 at 1080p by choice because in multiplayer there's no real advantage to 1440p on a 27" monitor. All your targets are much smaller in exchange for moderately higher sharpness.
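
For a sense of scale on the 1080p vs 1440p trade-off, a quick pixel-count comparison (plain arithmetic, nothing specific to BF4 or any particular card):

code:

# Pixel-count comparison between the two resolutions being discussed.
res_1080p = 1920 * 1080     # 2,073,600 pixels
res_1440p = 2560 * 1440     # 3,686,400 pixels
print(f"1440p pushes {res_1440p / res_1080p:.2f}x the pixels of 1080p "
      f"({res_1440p:,} vs {res_1080p:,})")   # ~1.78x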

Haeleus
Jun 30, 2009

He made one fatal slip when he tried to match the ranger with the big iron on his hip.

Byolante posted:

Are you running at 1080p or 1440p?

Just 1080p.

movax
Aug 30, 2008

Agreed posted:

This benchmark is garbage! They only test the 290X and the 780Ti in situations where the 290X has a clear advantage, like raising the resolution, or enabling more features! Coming away from that, you'd get the impression that for about twice the noise, you can get pretty much the same performance, or slightly better, for nearly $300 less!

Oh wait that's accurate poo poo gently caress

But but better drivers :smith:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

movax posted:

But but better drivers :smith:

I have no regrets, G-Sync gonna kill :rock:

(AMD-Sync needs to hit ASAP, though, once people actually see it I'm certain skeptics will be like WOAHLY gently caress that is COOL AS poo poo after all :unsmith:)

((parenthetical thought bubble two: AMD and nVidia need to drop the bullshit and get together on hammering out a compliant methodology for fixing D3D's issues. OpenGL has way, way fewer issues but the last OpenGL game to try to do anything cool was RAGE and it launched to, deservedly, no real acclaim and had probably the worst lighting and object texture issues of any game I've played that was made after Doom 3))

(((I really ought to just put these in their own little sentences, it's not like they don't apply. It's probably a total sin against grammar too, let alone the weird sotto voce quality it lends to what are just normal thoughts without any particular need to be "asides" - well, perhaps with the exception of this one, but now I'm losing myself navel gazing into the meta mirror of posting about posting)))

(((( <--- hehe that looks like two butts))))

Edit: I do think the 780Ti wins the overclocking contest on average, which is pretty impressive for a last-gen design, but at the same time what the hell would you expect out of a fully enabled 7.1 billion transistor GPU aggressively binned for lower voltage operation at higher frequencies? Before someone says it's disingenuous to call it a last-gen design, I will definitely acknowledge that they are not charging last-gen prices for it. And nVidia is also being more forward in terms of adding support for API features, well, except for that one little Mantle thing... :shepface:

Agreed fucked around with this message at 19:43 on Dec 29, 2013

Wistful of Dollars
Aug 25, 2009

Agreed posted:

I have no regrets, G-Sync gonna kill :rock:

(AMD-Sync needs to hit ASAP, though, once people actually see it I'm certain skeptics will be like WOAHLY gently caress that is COOL AS poo poo after all :unsmith:)

((parenthetical thought bubble two: AMD and nVidia need to drop the bullshit and get together on hammering out a compliant methodology for fixing D3D's issues. OpenGL has way, way fewer issues but the last OpenGL game to try to do anything cool was RAGE and it launched to, deservedly, no real acclaim and had probably the worst lighting and object texture issues of any game I've played that was made after Doom 3))

(((I really ought to just put these in their own little sentences, it's not like they don't apply. It's probably a total sin against grammar too, let alone the weird sotto voce quality it lends to what are just normal thoughts without any particular need to be "asides" - well, perhaps with the exception of this one, but now I'm losing myself navel gazing into the meta mirror of posting about posting)))

(((( <--- hehe that looks like two butts))))

Edit: I do think the 780Ti wins the overclocking contest on average, which is pretty impressive for a last-gen design, but at the same time what the hell would you expect out of a fully enabled 7.1 billion transistor GPU aggressively binned for lower voltage operation at higher frequencies? Before someone says it's disingenuous to call it a last-gen design, I will definitely acknowledge that they are not charging last-gen prices for it. And nVidia is also being more forward in terms of adding support for API features, well, except for that one little Mantle thing... :shepface:

My untimely discovery of crossfire's windowed mode issues has actually swung me back to nvidia. I might get a pair of 780s to replace my unlocked 290s. It's a step down in overall power but with the high prices for 290s I'll at worst break even and will probably make a profit.

G-sync is just icing on the cake.

Would be willing to make goons a good deal.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
SLI doesn't work in windowed mode either.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Factory Factory posted:

SLI doesn't work in windowed mode either.

It has worked for a while for most titles, though it does take a performance hit.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Agreed posted:

(parenthetical thought bubble two: AMD and nVidia need to drop the bullshit and get together on hammering out a compliant methodology for fixing D3D's issues. OpenGL has way, way fewer issues but the last OpenGL game to try to do anything cool was RAGE and it launched to, deservedly, no real acclaim and had probably the worst lighting and object texture issues of any game I've played that was made after Doom 3))
they can't; Microsoft has to, and has shown no interest in doing so.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Professor Science posted:

they can't; Microsoft has to, and has shown no interest in doing so.

So what's the consumer equivalent of torches and pitchforks to Microsoft's castle? :mad:

I know the reality is far more complex than just "hey guys work together and we can overcome! :unsmith:," it's just really frustrating seeing such a well-understood problem persist, and persist, and persist... for no good reason other than MS can't especially be arsed to implement some really not earth-shattering changes, just some very important and useful changes period, in a way that promotes rather than restricts adoption. Give developers situations where the GPU's computational power can be the bottleneck, so they don't have to resort to neat tricks to work around the card's limits, and we could see some profoundly different basic approaches to very large scenes, a total shift in what the idea of a very large scene even means.

It is ridiculous that OpenGL is kinda lapping Microsoft right now in terms of putting out features that developers could sure use. Uggh. Even a best guess at utilization would be a hell of a lot better than basically the world's most resource-intensive idle loop, with the CPU just being clueless, constantly updating largely static object states and limiting draw call and batch ops, generally for no better reason than "ehhh..." And on top of that, Microsoft has the balls to come out and say (retracted or not!) that DX11 is mostly feature complete? Come onnnnnn :mad:
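
The draw call overhead being griped about above is easy to put rough numbers on. The per-call CPU cost below is an invented placeholder, not a measured D3D11 figure; it's only meant to show how quickly submission time eats a 60fps frame budget.

code:

# Back-of-the-envelope sketch of why per-draw-call CPU overhead matters.
COST_PER_CALL_US = 10            # assumed CPU cost per draw call, in microseconds
FRAME_BUDGET_MS = 1000 / 60      # ~16.7 ms per frame at 60 fps

for draw_calls in (1_000, 5_000, 10_000):
    cpu_ms = draw_calls * COST_PER_CALL_US / 1000
    print(f"{draw_calls:>6} draw calls -> ~{cpu_ms:.0f} ms of CPU submission time "
          f"({cpu_ms / FRAME_BUDGET_MS:.1f}x the 60 fps frame budget)")

Cutting that per-call cost (and spreading submission across cores) is exactly the kind of thing Mantle is pitching itself as doing.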

EkardNT
Mar 31, 2011
Speaking of Mantle, in realistic terms how significant a performance advantage will it likely end up being for AMD cards vs NVIDIA? I've seen hyperbole up to and including predictions that it will be so revolutionary you won't be able to sell a 780Ti for :10bux:, but I haven't seen a calm analysis of the degree to which it will benefit AMD.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Agreed posted:

So what's the consumer equivalent of torches and pitchforks to Microsoft's castle? :mad:
(wooh long post)

Let's be real--even though OGL is doing some stuff, it still sucks. There's pretty much nothing revolutionary going on, it's minor improvements at best and further evidence that nobody has a clue what the actual successor to GL will be. (or more specifically, the successor to the classic GL pipeline model.)

If you put your Remembering Hats on and think back to 2007 or 2008, Larrabee actually tried to do something about this. We can mock Larrabee all we want ("it's what happens when software guys design hardware" is what I heard a lot at the time, and it's pretty true), but I will give them props for trying something to get beyond the standard pipeline. (if you don't actually know much about Larrabee, go read everything linked here)

Larrabee failed for two reasons:

1. it sucked tremendously at being a D3D/OGL device. A big part of this was their own naivete ("pfft we don't need a ROP, we'll do it in software"). I think Forsyth in his talk at Stanford mentions that they have a large number of different pipelines implemented for different apps that all have different performance, with no way to tell a priori which way would be fastest. The software costs would be astronomical.

2. it didn't get a console win. lots of reasons for this (Intel being largely indifferent, everybody being really gunshy about exotic architectures after Cell, really high power consumption), but Larrabee only had a chance at doing something interesting if it could get a console win in order to gain widespread developer interest/acceptance/traction.

okay, I keep talking about "doing something interesting." what am I talking about? if you go back and look at the graphics pipeline throughout history, it hasn't changed all that much. sure, we've added pixel, vertex, geometry, and now compute shaders in both D3D and OGL. there's tessellation too. lots of new stuff! but there's still fundamentally a pretty straightforward pipeline that ends at the ROP. you can insert interesting things in various places (and people definitely do, see the siggraph papers any given year), but nobody is able to build any sort of arbitrary pipelines with reasonable efficiency. Larrabee's goal was to be able to throw all of the existing model out the window and let sufficiently smart developers implement whatever. want some sparse voxel octree nonsense? sure. micropolys everywhere? also okay. something really weird? yeah fine.

(for more on this, read Kayvon Fatahalian's dissertation or his SIGGRAPH talk. actually, just read everything he writes. he's one of Pat Hanrahan's former PhD students, like the one guy that invented CUDA at NVIDIA and the other guy that was one of the main GCN architects, and he is ludicrously smart.)

similarly: in the GPU compute realm, nobody's figured out anything to do with the ROP. it is a vestigial weird graphics thing that has no interface that looks anything like a normal programming language and nobody knows how to expose it. if somebody did figure out how to expose it, you could probably end up writing reasonable graphics pipelines in CUDA/OCL/something else. but nobody has, and now that CUDA is purely focused on HPC and OpenCL is focused (insofar as OpenCL has ever been focused at all) on mobile, I don't know that anyone will. (well OCL won't, the CPU and DSP guys won't let the GPU guys enable specialized hardware that they can't emulate quickly)

obviously, Microsoft could improve their driver model without addressing these issues (as Mantle tries to do), but there's little reason for them to do so. despite developers whining to the contrary, their driver model is largely fine for its intended use, and it's not clear that existing hardware could support anything better.

so if MS doesn't care right now, who's left? Apple could do something Mantle-like on iOS, but they won't until they ship only Rogue-based platforms for a long time (also not sure if it even makes sense there). I doubt they really care on desktop based on their rate of GL version adoption. maybe Android at some point.
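
As a toy illustration of the "fixed pipeline vs. arbitrary pipelines" point above, here's a sketch in Python. The stage functions and names are made up for illustration; they're not real D3D, OpenGL, or Larrabee code.

code:

# Toy model, not real graphics code: the classic hardware pipeline bakes in the
# stage order and ends at the ROP, while a Larrabee-style software renderer
# would let the developer compose whatever stages they want.
def vertex_shade(tri): return tri        # placeholder stages
def rasterize(tri): return [tri]         # triangle -> list of "fragments"
def pixel_shade(frag): return frag
def rop_blend(fb, frag): fb.append(frag)

def fixed_pipeline(triangles):
    """The D3D/OGL model: vertex -> raster -> pixel -> ROP, in that order."""
    framebuffer = []
    for tri in triangles:
        for frag in rasterize(vertex_shade(tri)):
            rop_blend(framebuffer, pixel_shade(frag))
    return framebuffer

def custom_pipeline(scene, stages):
    """The Larrabee promise: any composition of stages, e.g. sparse voxel
    octrees or micropolygons, limited only by how fast your software runs."""
    data = scene
    for stage in stages:
        data = stage(data)
    return data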

Straker
Nov 10, 2005
Holy crap got stuck in another immediate, no-legitimate-explanation BSOD on boot scenario after playing BF4 for about two minutes. I spent about three seconds trying to fix it and just uninstalled virtu instead, that fixed it immediately, gently caress virtu. My igpu is still disabled from the previous drama anyway so virtu is totally useless to me instead of mostly useless.

Still, this is weird, I've had all sorts of monitor configurations on the same setup for 3 years, the only difference is now I have two AMD cards instead of one, anyone else ever have driver issues as crazy as this?

edit: well, speaking of Windows 95-style bullshit, I got stuck in a BSOD boot loop for a third time! This time I resolved it by removing my TV tuner... it's also an AMD/ATI card so there may be some driver overlap/interference or something, I recalled seeing some windows notification about the thing having no driver sometime when I was messing around trying to fix the original BSOD problem, and was like hey that's another video thing, may as well remove it and see what happens. I haven't seen anything this loving ridiculous since XP, hopefully this finally fixes it.

Straker fucked around with this message at 09:21 on Dec 30, 2013

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

El Scotch posted:

G-sync is just icing on the cake.

G-Sync definitely doesn't work in windowed mode.

deimos posted:

It has worked for a while for most titles, though it does take a performance hit.

That's a lot of caveats. Any recent sources or benchmarks you know of? I was having a look around but I couldn't see anything specifically about this.

HalloKitty fucked around with this message at 10:17 on Dec 30, 2013

BurritoJustice
Oct 9, 2012

EkardNT posted:

Speaking of Mantle, in realistic terms how significant a performance advantage will it likely end up being for AMD cards vs NVIDIA? I've seen hyperbole up to and including predictions that it will be so revolutionary you won't be able to sell a 780Ti for :10bux:, but I haven't seen a calm analysis of the degree to which it will benefit AMD.

DICE says roughly a 10% longer development schedule for up to a 20% GPU performance increase.

sigher
Apr 22, 2008

My guiding Moonlight...



Is it possible to get Shadowplay working with emulators?

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

HalloKitty posted:

That's a lot of caveats. Any recent sources or benchmarks you know of? I was having a look around but I couldn't see anything specifically about this.

No, I just remember reading they got it working circa 2011 and seeing some amateur benchmarks for it.

Phuzun
Jul 4, 2007

Well, I had both my GTX 570 and 780 installed with the idea that I'd keep the 570 folding 24/7 or have it for PhysX games, but I ran into several issues and pulled it back out of this machine. First, folding@home does not use the normal index for the GPUs... but it does for CUDA/OpenCL indexes. This results in all kinds of fuckery with the config.xml to find which settings will get them to fold, and then I ran into issues with GPUs folding and the work units just disappearing when pausing them. Second, the waterblocks don't line up (pretty much expected this), so the water was looping through the 780 before hitting the restrictive 570 block versus having the flow split between them. This resulted in adding about 15-20°C onto my CPU and 780 temperatures.

While there was some performance improvement from the dedicated PhysX card, it was insignificant compared to the raw power of the 780. Batman: Arkham Origins already runs maxed out in the 60-90fps range, and with the dedicated PhysX card it gained around 10fps. I'm going to throw that 570 into another machine at this point and just work on overclocking/BIOS tweaking the 780.

e: on the topic of Shadowplay: has anyone encountered an issue where the audio is not captured, or, if it is captured, it's only the rear/surround audio? Figured this out: the Asus Xonar DSX has an option called GX, and that's what was causing audio to go missing in Shadowplay.

Phuzun fucked around with this message at 17:09 on Dec 30, 2013

sedaps
Jan 29, 2004
I just purchased 2 X-Star DP2710 monitors hoping for a dual setup with both running at 1440p. Unfortunately I only have one DVI port on my Radeon 7870. I was about to just get a Mini-DP to DVI adapter that's about $100, but then I thought: what if I get a new card?

So I'm seeing all the GTX 660/670/680 cards have dual DVI outputs. Is it worth it for me to upgrade from the Radeon 7870? Or should I buy the $100 adapter now and then wait for a more significant bang for my buck a year down the road?

Phuzun
Jul 4, 2007

sedaps posted:

I just purchased 2 X-Star DP2710 monitors hoping for a dual setup with both running at 1440p. Unfortunately I only have one DVI port on my Radeon 7870. I was about to just get a Mini-DP to DVI adapter that's about $100, but then I thought: what if I get a new card?

So I'm seeing all the GTX 660/670/680 cards have dual DVI outputs. Is it worth it for me to upgrade from the Radeon 7870? Or should I buy the $100 adapter now and then wait for a more significant bang for my buck a year down the road?

Would these cables not work for your card/display? They are much cheaper than $100.
http://www.monoprice.com/Category?c_id=102&cp_id=10246&cs_id=1024604

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
Or get one of the short adapters like they sell for Mac laptops.

sedaps
Jan 29, 2004

Phuzun posted:

Would these cables not work for your card/display? They are much cheaper than $100.
http://www.monoprice.com/Category?c_id=102&cp_id=10246&cs_id=1024604

I think those are single-link DVI. This is the Monoprice version of what I need, although from my research the reviews on them are not so great.

Phuzun
Jul 4, 2007

sedaps posted:

I think those are single-link DVI. This is the Monoprice version of what I need, although from my research the reviews on them are not so great.

Yeah, looking at them now, they do appear to be single-link. You had mentioned the GTX 600 series, but AMD also has cards that feature two dual-link DVI outputs, including other 7870s. Instead of buying an adapter, you could get a second 7870 for the extra performance and even have a third DVI output for later.
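
For anyone wondering why single-link DVI doesn't cut it here, a quick arithmetic sketch. The link limits are the standard DVI pixel clock ceilings; blanking overhead is ignored to keep the numbers simple.

code:

# Why 2560x1440 @ 60 Hz needs dual-link DVI (blanking overhead ignored).
width, height, refresh_hz = 2560, 1440, 60
needed_mhz = width * height * refresh_hz / 1e6   # ~221 MHz pixel clock

SINGLE_LINK_MAX_MHZ = 165    # single-link DVI pixel clock limit
DUAL_LINK_MAX_MHZ = 330      # dual-link doubles the TMDS pairs

print(f"needed: ~{needed_mhz:.0f} MHz pixel clock")
print(f"single-link DVI ({SINGLE_LINK_MAX_MHZ} MHz): not enough")
print(f"dual-link DVI ({DUAL_LINK_MAX_MHZ} MHz): fine")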

Wistful of Dollars
Aug 25, 2009

HalloKitty posted:

G-Sync definitely doesn't work in windowed mode.

That's good to know. That said, I would probably suck it up and play full screen to use it.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Professor Science posted:

(if you don't actually know much about Larrabee, go read everything linked here)

GPUs don't work like they think you do: the link

The_Franz
Aug 8, 2003

BurritoJustice posted:

DICE says roughly a 10% longer development schedule for up to a 20% GPU performance increase.

Those figures aren't AMD vs Nvidia, but rather Mantle vs D3D. Once the API is actually publicly released we should be able to see some benchmark comparisons between an optimized AMD path with Mantle and an OpenGL path that uses the Nvidia extensions.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Straker posted:

Holy crap got stuck in another immediate, no-legitimate-explanation BSOD on boot scenario after playing BF4 for about two minutes. I spent about three seconds trying to fix it and just uninstalled virtu instead, that fixed it immediately, gently caress virtu. My igpu is still disabled from the previous drama anyway so virtu is totally useless to me instead of mostly useless.

Still, this is weird, I've had all sorts of monitor configurations on the same setup for 3 years, the only difference is now I have two AMD cards instead of one, anyone else ever have driver issues as crazy as this?

edit: well, speaking of Windows 95-style bullshit, I got stuck in a BSOD boot loop for a third time! This time I resolved it by removing my TV tuner... it's also an AMD/ATI card so there may be some driver overlap/interference or something, I recalled seeing some windows notification about the thing having no driver sometime when I was messing around trying to fix the original BSOD problem, and was like hey that's another video thing, may as well remove it and see what happens. I haven't seen anything this loving ridiculous since XP, hopefully this finally fixes it.

As pointed out to me by FF, you're not the only one experiencing some serious problems with the unbelievably buggy state of BF4. Stockholders are too!

P.S. first production proven Mantle patch now moved to January due to lawsuits, welp

Agreed fucked around with this message at 20:56 on Dec 30, 2013

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
I was gonna wait for the 800 series but I'm just too tempted. Ordered an MSI 760 Twin Frozr. Looks like the coolest card that will fit in an SG05.

Purgatory Glory
Feb 20, 2005
Have any of the reviewers with G-Sync monitors tried SLI with it yet? Seems pretty game-changing if micro-stuttering is eliminated or greatly reduced with it.

Hot Stunt
Oct 2, 2009



Thanks to cryptocurrency mining I was able to sell my six month old 7950 for $50 more than I paid for it. Upgraded to an R9-290. Spelunky and Papers, Please are running super smooth.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Purgatory Glory posted:

Have any of the reviewers with G-Sync monitors tried SLI with it yet? Seems pretty game-changing if micro-stuttering is eliminated or greatly reduced with it.

From nVidia's G-Sync FAQ:

quote:

Q: How does NVIDIA G-SYNC work with SLI?

A: The NVIDIA GPU connected to the display manages G-SYNC. SLI GPU setups work seamlessly with G-SYNC displays.

I'm not sure what overhead there would be associated with managing the display output, but I would imagine that it isn't very much considering most of the heavy lifting is done by the scaler replacement FPGA as opposed to the GPU. nVidia has emphasized frame pacing and coherence over raw additional card scaling for generations. We won't know anything solid beyond the previously stated performance hit for G-Sync's super duper frame pacing, but I don't think it would be anything higher, proportionately, for two cards versus one, especially since only one card is having to do the timing stuff for the monitor.

Still, this isn't church, wait for reviews before making an assessment.
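
To make the frame-pacing point concrete, here's a small illustrative calculation. The frame times are made-up example values, not benchmark numbers; the point is only that a fixed 60Hz v-synced display quantizes presentation to 16.7ms ticks, while a variable-refresh display shows each frame as soon as it's done.

code:

# Toy illustration: fixed-refresh v-sync vs. variable refresh (G-Sync style).
import math

REFRESH_INTERVAL_MS = 1000 / 60            # 60 Hz panel tick
frame_times_ms = [12.0, 21.0, 15.0, 24.0]  # hypothetical render times

for t in frame_times_ms:
    vsync_ms = math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
    print(f"rendered in {t:4.1f} ms -> v-sync shows it at {vsync_ms:4.1f} ms, "
          f"variable refresh at {t:4.1f} ms")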
