|
Agreed posted:This benchmark is garbage! They only test the 290x and the 780Ti in situations where the 290X has a clear advantage, like raising the resolution, or enabling more features! Coming from that, you'd get the opinion that for about twice the noise, you can get pretty much the same performance, or slightly better, for nearly $300 less! Don't feel bad. I'm still planning on joining you in the 780ti owners club.
|
# ? Dec 29, 2013 08:22 |
|
Agreed posted:This benchmark is garbage! They only test the 290x and the 780Ti in situations where the 290X has a clear advantage, like raising the resolution, or enabling more features! Coming from that, you'd get the opinion that for about twice the noise, you can get pretty much the same performance, or slightly better, for nearly $300 less! Doesn't take into account the 780ti's 25-35% overclocking headroom with the stock cooler, versus the 290x's pretty much negative overclocking headroom (it gets slower) with the stock cooler at any reasonable noise level. So don't feel too bad?
|
# ? Dec 29, 2013 11:40 |
|
BurritoJustice posted:Doesn't take into account the 780ti's 25-35% overclocking headroom with the stock cooler, versus the 290x's pretty much negative overclocking headroom (it gets slower) with the stock cooler at any reasonable noise level. So don't feel too bad? But you could get a cooler for the 290x and still come out $200+ ahead. vv Though shadowplay and the assurances that Green Light allows sure are nice.
|
# ? Dec 29, 2013 13:09 |
|
Ghostpilot posted:But you could get a cooler for the 290x and still come out $200+ ahead. vv Though shadowplay and the assurances that Green Light allows sure are nice. I can get a 780ti for the same price as a 290x and an aftermarket cooler here in lovely sunny Australia, home of the "why? because gently caress you, that's why" technology tax.
|
# ? Dec 29, 2013 14:41 |
|
If I'm replacing a 7950 with a 780, what do I need to remove/uninstall to avoid problems with driver conflicts? Going to order an EVGA 780 Superclocked ACX. The Classified is ~£40 more expensive but doesn't seem to be worth the extra.
|
# ? Dec 29, 2013 14:44 |
|
Byolante posted:I can get a 780ti for the same price as a 290x and an aftermarket cooler here in lovely sunny Australia, home of the "why? because gently caress you, that's why" technology tax. You should think of it in terms of not having to settle for a crappier product just because it's so much cheaper as to be worthwhile. It's really a blessing in disguise.
|
# ? Dec 29, 2013 15:10 |
|
CactusWeasle posted:If I'm replacing a 7950 with a 780, what do I need to remove/uninstall to avoid problems with driver conflicts? I believe the general driver-cleaning procedure for AMD is to run their uninstaller first and then use Driver Sweeper to get whatever is left over. That's what I've done when I've needed to clean out AMD drivers.
|
# ? Dec 29, 2013 16:27 |
|
Very satisfied with my EVGA 780 ACX so far, running Battlefield 4 at 100fps on ultra with high FXAA (I get drops to around 70 occasionally with MSAA, so I'm still deciding if it's worth the fps hit). Funny how BF4 is running so much better than the horribly optimized AC4 that, after all, came bundled with the card. I've managed to hit a stable 1200 MHz core clock with Precision X; maybe I'll start fiddling with the memory clock next.
|
# ? Dec 29, 2013 17:41 |
|
Haeleus posted:Very satisfied with my EVGA 780 ACX so far, running Battlefield 4 at 100fps on ultra with high FXAA (I get drops to around 70 occasionally with MSAA, so I'm still deciding if it's worth the fps hit). Funny how BF4 is running so much better than the horribly optimized AC4 that, after all, came bundled with the card. Are you running at 1080p or 1440p?
|
# ? Dec 29, 2013 17:48 |
|
Here's a thought for an alternative to the NZXT Kraken G10: Some folks in SA-mart are offering steelcutting services with a Goon discount. If anyone's good at CAD, you can make up a GPU bracket and get it made.
|
# ? Dec 29, 2013 18:06 |
|
Byolante posted:Are you running at 1080p or 1440p? Just 1080p.
|
# ? Dec 29, 2013 18:24 |
|
Agreed posted:This benchmark is garbage! They only test the 290x and the 780Ti in situations where the 290X has a clear advantage, like raising the resolution, or enabling more features! Coming from that, you'd get the opinion that for about twice the noise, you can get pretty much the same performance, or slightly better, for nearly $300 less! But but better drivers
|
# ? Dec 29, 2013 19:11 |
|
movax posted:But but better drivers I have no regrets, G-Sync gonna kill (AMD-Sync needs to hit ASAP, though, once people actually see it I'm certain skeptics will be like WOAHLY gently caress that is COOL AS poo poo after all ) ((parenthetical thought bubble two: AMD and nVidia need to drop the bullshit and get together on hammering out a compliant methodology for fixing D3D's issues. OpenGL has way, way fewer issues but the last OpenGL game to try to do anything cool was RAGE and it launched to, deservedly, no real acclaim and had probably the worst lighting and object texture issues of any game I've played that was made after Doom 3)) (((I really ought to just put these in their own little sentences, it's not like they don't apply. It's probably a total sin against grammar too, let alone the weird sotto voce quality it lends to what are just normal thoughts without any particular need to be "asides" - well, perhaps with the exception of this one, but now I'm losing myself navel gazing into the meta mirror of posting about posting))) (((( <--- hehe that looks like two butts)))) Edit: I do think the 780Ti wins the overclocking contest on average, which is pretty impressive for a last-gen design, but at the same time what the hell would you expect out of a fully enabled 7.1 billion transistor GPU aggressively binned for lower voltage operation at higher frequencies? Before someone says it's disingenuous to call it a last-gen design, I will definitely acknowledge that they are not charging last-gen prices for it. And nVidia is also being more forward in terms of adding support for API features, well, except for that one little Mantle thing... Agreed fucked around with this message at 19:43 on Dec 29, 2013 |
# ? Dec 29, 2013 19:40 |
|
Agreed posted:I have no regrets, G-Sync gonna kill My untimely discovery of crossfire's windowed mode issues has actually swung me back to nvidia. I might get a pair of 780s to replace my unlocked 290s. It's a step down in overall power but with the high prices for 290s I'll at worst break even and will probably make a profit. G-sync is just icing on the cake. Would be willing to make goons a good deal.
|
# ? Dec 30, 2013 04:18 |
|
SLI doesn't work in windowed mode either.
|
# ? Dec 30, 2013 04:41 |
|
Factory Factory posted:SLI doesn't work in windowed mode either. It has worked for a while in most titles, though it does take a performance hit.
|
# ? Dec 30, 2013 04:44 |
|
Agreed posted:(parenthetical thought bubble two: AMD and nVidia need to drop the bullshit and get together on hammering out a compliant methodology for fixing D3D's issues. OpenGL has way, way fewer issues but the last OpenGL game to try to do anything cool was RAGE and it launched to, deservedly, no real acclaim and had probably the worst lighting and object texture issues of any game I've played that was made after Doom 3)) they can't, Microsoft has to and has shown no interest in doing such.
|
# ? Dec 30, 2013 04:57 |
|
Professor Science posted:they can't, Microsoft has to and has shown no interest in doing such. So what's the consumer equivalent of torches and pitchforks to Microsoft's castle? I know the reality is far more complex than just "hey guys, work together and we can overcome!", it's just really frustrating seeing such a well understood problem persist, and persist, and persist... for no good reason other than MS can't especially be arsed to implement some really not earth-shattering changes, just some very important and useful changes, period, in a way that promotes rather than restricts adoption. Give developers situations where the GPU's computational power can be the bottleneck and they won't have to resort to using neat tricks to work around the card's logic; we could see some fairly profoundly different basic approaches to very large scenes, a total shift in what the idea of a very large scene even means. It is ridiculous that OpenGL is kinda lapping Microsoft right now in terms of putting out features that developers could sure use. Uggh. Even a best guess at utilization would be a hell of a lot better than basically the world's most resource intensive idle loop, with the CPU just being clueless, constantly updating largely static object states, and limiting draw call and batch ops, generally for no better reason than "ehhh...," and on top of that, Microsoft has the balls to come out and say (retracted or not!) that DX11 is mostly feature complete? Come onnnnnn
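[Editor's note: the "world's most resource intensive idle loop" complaint above is essentially about per-draw-call CPU overhead in the D3D driver model. A back-of-the-envelope sketch of why it caps scene complexity — the 30 µs per-call cost is an invented, illustrative round number, not a measured figure; real overhead varies with driver, state changes, and CPU:]

```python
# Back-of-the-envelope draw-call budget at 60 fps. The per-call cost is
# an assumed illustrative figure, not a benchmark result.
FRAME_BUDGET_S = 1 / 60        # one frame at 60 fps (~16.7 ms)
DRAW_CALL_COST_S = 30e-6       # assumed CPU-side driver cost per draw call

max_draw_calls = int(FRAME_BUDGET_S / DRAW_CALL_COST_S)
print(max_draw_calls)  # 555 -- and that's with the CPU doing nothing else
```

Under these (made-up) numbers the CPU can only issue a few hundred draw calls per frame before it, not the GPU, becomes the bottleneck — which is why batching tricks and thinner APIs like Mantle matter.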
|
# ? Dec 30, 2013 05:21 |
|
Speaking of Mantle, in realistic terms how significant a performance improvement will it likely end up being for AMD cards vs NVIDIA? I've seen hyperbole up to and including predictions that it will be so revolutionary you won't be able to sell a 780Ti for , but I haven't seen a calm analysis of the degree to which it will benefit AMD.
|
# ? Dec 30, 2013 05:44 |
|
Agreed posted:So what's the consumer equivalent of torches and pitchforks to Microsoft's castle? Let's be real--even though OGL is doing some stuff, it still sucks. There's pretty much nothing revolutionary going on, it's minor improvements at best and further evidence that nobody has a clue what the actual successor to GL will be. (or more specifically, the successor to the classic GL pipeline model.)

If you put your Remembering Hats on and think back to 2007 or 2008, Larrabee actually tried to do something about this. We can mock Larrabee all we want ("it's what happens when software guys design hardware" is what I heard a lot at the time, and it's pretty true), but I will give them props for trying something to get beyond the standard pipeline. (if you don't actually know much about Larrabee, go read everything linked here)

Larrabee failed for two reasons:

1. it sucked tremendously at being a D3D/OGL device. A big part of this was their own naivete ("pfft we don't need a ROP, we'll do it in software"). I think Forsyth in his talk at Stanford mentions that they have a large number of different pipelines implemented for different apps that all have different performance, with no way to tell a priori which way would be fastest. The software costs would be astronomical.

2. it didn't get a console win. lots of reasons for this (Intel being largely indifferent, everybody being really gunshy about exotic architectures after Cell, really high power consumption), but Larrabee only had a chance at doing something interesting if it could get a console win in order to gain widespread developer interest/acceptance/traction.

okay, I keep talking about "doing something interesting." what am I talking about? if you go back and look at the graphics pipeline throughout history, it hasn't changed all that much. sure, we've added pixel, vertex, geometry, and now compute shaders in both D3D and OGL. there's tessellation too. lots of new stuff!
but there's still fundamentally a pretty straightforward pipeline that ends at the ROP. you can insert interesting things in various places (and people definitely do, see the siggraph papers any given year), but nobody is able to build any sort of arbitrary pipelines with reasonable efficiency. Larrabee's goal was to be able to throw all of the existing model out the window and let sufficiently smart developers implement whatever. want some sparse voxel octree nonsense? sure. micropolys everywhere? also okay. something really weird? yeah fine.

(for more on this, read Kayvon Fatahalian's dissertation or his SIGGRAPH talk. actually, just read everything he writes. he's one of Pat Hanrahan's former PhD students, like the one guy that invented CUDA at NVIDIA and the other guy that was one of the main GCN architects, and he is ludicrously smart.)

similarly: in the GPU compute realm, nobody's figured out anything to do with the ROP. it is a vestigial weird graphics thing that has no interface that looks anything like a normal programming language and nobody knows how to expose it. if somebody did figure out how to expose it, you could probably end up writing reasonable graphics pipelines in CUDA/OCL/something else. but nobody has, and now that CUDA is purely focused on HPC and OpenCL is focused (insofar as OpenCL has ever been focused at all) on mobile, I don't know that anyone will. (well OCL won't, the CPU and DSP guys won't let the GPU guys enable specialized hardware that they can't emulate quickly)

obviously, Microsoft could improve their driver model without addressing these issues (as Mantle tries to do), but there's little reason for them to do so. despite developers whining to the contrary, their driver model is largely fine for its intended use, and it's not clear that existing hardware could support anything better. so if MS doesn't care right now, who's left?
Apple could do something Mantle-like on iOS, but they won't until they ship only Rogue-based platforms for a long time (also not sure if it even makes sense there). I doubt they really care on desktop based on their rate of GL version adoption. maybe Android at some point.
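[Editor's note: for readers who haven't internalized the "straightforward pipeline that ends at the ROP" point, here is the fixed D3D11 stage order as a minimal illustration. The list itself is just a sketch; stage names follow Microsoft's D3D11 documentation:]

```python
# The fixed D3D11 graphics pipeline. Individual stages can be skipped
# (tessellation and geometry shading are optional), but the order can
# never be rearranged or extended -- the Output Merger (the ROP) is
# always the fixed-function end of the line.
D3D11_STAGES = [
    "Input Assembler",
    "Vertex Shader",
    "Hull Shader", "Tessellator", "Domain Shader",  # tessellation, optional
    "Geometry Shader",                              # optional
    "Rasterizer",
    "Pixel Shader",
    "Output Merger",                                # the ROP; fixed function
]
assert D3D11_STAGES[-1] == "Output Merger"
```

That rigidity is exactly what Larrabee tried (and failed) to throw out, and what nobody has figured out how to expose from the compute side.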
|
# ? Dec 30, 2013 07:45 |
Holy crap, got stuck in another immediate, no-legitimate-explanation BSOD-on-boot scenario after playing BF4 for about two minutes. I spent about three seconds trying to fix it and just uninstalled Virtu instead, that fixed it immediately, gently caress Virtu. My iGPU is still disabled from the previous drama anyway, so Virtu is totally useless to me instead of mostly useless. Still, this is weird, I've had all sorts of monitor configurations on the same setup for 3 years, the only difference is now I have two AMD cards instead of one, anyone else ever have driver issues as crazy as this?

edit: well, speaking of Windows 95-style bullshit, I got stuck in a BSOD boot loop for a third time! This time I resolved it by removing my TV tuner... it's also an AMD/ATI card so there may be some driver overlap/interference or something. I recalled seeing some Windows notification about the thing having no driver sometime when I was messing around trying to fix the original BSOD problem, and was like hey, that's another video thing, may as well remove it and see what happens. I haven't seen anything this loving ridiculous since XP, hopefully this finally fixes it. Straker fucked around with this message at 09:21 on Dec 30, 2013 |
|
# ? Dec 30, 2013 08:30 |
|
El Scotch posted:G-sync is just icing on the cake. G-Sync definitely doesn't work in windowed mode. deimos posted:It has for a while for most titles, it does take a performance hit though. That's a lot of caveats. Any recent sources or benchmarks you know of? I was having a look around but I couldn't see anything specifically about this. HalloKitty fucked around with this message at 10:17 on Dec 30, 2013 |
# ? Dec 30, 2013 10:10 |
|
EkardNT posted:Speaking of Mantle, in realistic terms how significant a performance improvement will it likely end up being for AMD cards vs NVIDIA? I've seen hyperbole up to and including predictions that it will be so revolutionary you won't be able to sell a 780Ti for , but I haven't seen a calm analysis of the degree to which it will benefit AMD. DICE says about a 10% longer development schedule for up to a 20% GPU performance increase.
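[Editor's note: to put DICE's "up to 20%" claim in concrete terms — this is pure arithmetic on the quoted figure, not a benchmark:]

```python
# What a 20% GPU performance increase means for frame rate and frame time.
speedup = 1.20                     # DICE's quoted best case
base_fps = 60.0
new_fps = base_fps * speedup       # 72.0 fps
frame_time_cut = 1 - 1 / speedup   # ~16.7% shorter frame times
print(new_fps, round(frame_time_cut * 100, 1))  # 72.0 16.7
```

So the best case is 60 fps becoming 72 fps; a nice bump, but well short of "can't sell a 780Ti" territory.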
|
# ? Dec 30, 2013 10:15 |
|
Is it possible to get Shadowplay working with emulators?
|
# ? Dec 30, 2013 10:41 |
|
HalloKitty posted:That's a lot of caveats. Any recent sources or benchmarks you know of? I was having a look around but I couldn't see anything specifically about this. No, I just remember reading they got it working circa 2011 and seeing some amateur benchmarks for it.
|
# ? Dec 30, 2013 13:48 |
|
Well, I had both my GTX 570 and 780 installed with the idea that I'd keep the 570 folding 24/7 or have it for PhysX games, but I ran into several issues and pulled it back out of this machine.

First, folding@home does not use the normal index for the GPUs... but it does for CUDA/OpenCL indexes. This results in all kinds of fuckery with the config.xml to find which settings will get them to fold, but then I ran into issues with GPUs folding and the work units just disappearing when pausing them.

Second, the waterblocks don't line up (pretty much expected this), so the water was looping through the 780 before hitting the restrictive 570 block versus having the flow split between them. This resulted in adding about 15-20C onto my CPU and 780 temperatures.

While there was some performance improvement from the dedicated PhysX card, it was insignificant next to the raw power of the 780. Batman: Arkham Origins already runs maxed out in the 60-90fps range, and with the dedicated PhysX card it gained around 10fps. I'm going to throw that 570 into another machine at this point and just work on overclocking/BIOS tweaking the 780.

e: on the topic of Shadowplay, has anyone encountered an issue where the audio is not captured, or, if it is captured, it is only the rear/surround audio? Figured this out: the Asus Xonar DSX has an option called GX, and this is what was causing audio to go missing in Shadowplay. Phuzun fucked around with this message at 17:09 on Dec 30, 2013 |
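[Editor's note: for anyone else fighting the same config.xml indexing mismatch — FAHClient v7 numbers its GPU slots itself, with separate CUDA/OpenCL device-index overrides that don't have to agree with each other. The option names below are a sketch from memory of v7 clients and may not match your client version; check your client's documentation before trusting them:]

```xml
<config>
  <!-- One folding slot per GPU; slot ids are FAHClient's own numbering,
       not the order Windows or the NVIDIA driver reports the cards in -->
  <slot id='1' type='GPU'>
    <!-- These override which device the slot uses; they index into the
         CUDA and OpenCL device lists, which need not match each other -->
    <gpu-index v='0'/>
    <cuda-index v='0'/>
    <opencl-index v='0'/>
  </slot>
  <slot id='2' type='GPU'>
    <gpu-index v='1'/>
    <cuda-index v='1'/>
    <opencl-index v='1'/>
  </slot>
</config>
```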
# ? Dec 30, 2013 14:22 |
|
I just purchased 2 X-Star DP2710 monitors hoping for a dual setup with both running at 1440p. Unfortunately I only have one DVI port on my Radeon 7870. I was about to just get a Mini-DP to DVI adapter that's about $100, but then I thought: what if I get a new card? I'm seeing all the GTX 660/670/680 cards have dual DVI outputs. Is it worth it for me to upgrade from the Radeon 7870? Or should I buy an adapter for $100 now and then wait for a more significant bang for my buck a year down the road?
|
# ? Dec 30, 2013 15:21 |
|
sedaps posted:I just purchased 2 X-Star DP2710 monitors hoping for a dual setup with both running 1440. Unfortunately I only have one DVI port on my Radeon 7870. I was about to just get a Mini-DP to DVI adapter that's about $100, but then I thought what if I get a new card? Would these cables not work for your card/display? They are much cheaper than $100. http://www.monoprice.com/Category?c_id=102&cp_id=10246&cs_id=1024604
|
# ? Dec 30, 2013 15:30 |
|
Or get one of the short adapters like they sell for Mac laptops.
|
# ? Dec 30, 2013 15:39 |
|
Phuzun posted:Would these cables not work for your card/display? They are much cheaper than $100. I think those are single-link DVI. This is the Monoprice version of what I need, although from some research the reviews on them are not so great.
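[Editor's note: the single-link vs dual-link distinction comes down to pixel clock. Single-link DVI tops out at a 165 MHz TMDS clock, and 2560×1440 at 60 Hz needs well more than that even with reduced blanking — hence the expensive active adapters. The blanking figures below are approximate CVT reduced-blanking values:]

```python
# Why 1440p at 60 Hz needs dual-link DVI: the required pixel clock
# exceeds the single-link 165 MHz limit. Blanking totals are approximate
# CVT reduced-blanking values.
h_total = 2560 + 160     # active pixels + horizontal blanking (approx.)
v_total = 1440 + 40      # active lines + vertical blanking (approx.)
refresh_hz = 60

pixel_clock_hz = h_total * v_total * refresh_hz
SINGLE_LINK_MAX_HZ = 165e6
DUAL_LINK_MAX_HZ = 2 * SINGLE_LINK_MAX_HZ    # second TMDS link doubles it

print(pixel_clock_hz / 1e6)  # 241.536 -- over single-link, within dual-link
assert SINGLE_LINK_MAX_HZ < pixel_clock_hz < DUAL_LINK_MAX_HZ
```

This is also why the cheap passive Mini-DP adapters only do single-link: 1080p fits under 165 MHz, 1440p doesn't.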
|
# ? Dec 30, 2013 16:13 |
|
sedaps posted:I think those are single link DVI. This is the monoprice version of what I need, although the reviews are not so great on them from some research. Yeah, looking at them now, they do appear to be single link. You had mentioned the GTX 600 series, but AMD also has cards that feature 2 dual-link DVI outputs, including other 7870s. Instead of buying an adapter, you could get a second 7870 for the extra performance and even have a third DVI output for later.
|
# ? Dec 30, 2013 16:54 |
|
HalloKitty posted:G-Sync definitely doesn't work in windowed mode. That's good to know. That said, I would probably suck it up and play full screen to use it.
|
# ? Dec 30, 2013 16:55 |
|
Professor Science posted:(if you don't actually know much about Larrabee, go read everything linked here) GPUs don't work like you think they do: the link
|
# ? Dec 30, 2013 19:19 |
|
BurritoJustice posted:DICE says 10% increased development schedule for up to 20% GPU performance increase. Those figures aren't AMD vs Nvidia, but rather Mantle vs D3D. Once the API is actually publicly released we should be able to see some benchmark comparisons between an optimized AMD path with Mantle and an OpenGL path that uses the Nvidia extensions.
|
# ? Dec 30, 2013 19:54 |
|
Straker posted:Holy crap got stuck in another immediate, no-legitimate-explanation BSOD on boot scenario after playing BF4 for about two minutes. I spent about three seconds trying to fix it and just uninstalled virtu instead, that fixed it immediately, gently caress virtu. My igpu is still disabled from the previous drama anyway so virtu is totally useless to me instead of mostly useless. As pointed out to me by FF, you're not the only one experiencing some serious problems with the unbelievably buggy state of BF4. Stockholders are too! P.S. first production proven Mantle patch now moved to January due to lawsuits, welp Agreed fucked around with this message at 20:56 on Dec 30, 2013 |
# ? Dec 30, 2013 20:54 |
|
I was gonna wait for the 800 series but I'm just too tempted. Ordered an MSI 760 Twin Frozr. Looks like the coolest card that will fit in an SG05.
|
# ? Dec 30, 2013 21:07 |
|
Have any of the reviewers with G-Sync monitors tried SLI with it yet? Seems pretty game-changing if micro-stuttering is eliminated or greatly reduced with it.
|
# ? Dec 30, 2013 22:22 |
|
Thanks to cryptocurrency mining I was able to sell my six month old 7950 for $50 more than I paid for it. Upgraded to an R9-290. Spelunky and Papers, Please are running super smooth.
|
# ? Dec 31, 2013 02:20 |
|
Purgatory Glory posted:Have any of the reviewers with Gsync monitors tried SLI with it yet? Seems pretty game changing if micro-stuttering is eliminated or reduced greatly with it. From nVidia's G-Sync FAQ: quote:Q: How does NVIDIA G-SYNC work with SLI? I'm not sure what overhead there would be associated with managing the display output, but I would imagine that it isn't very much considering most of the heavy lifting is done by the scaler replacement FPGA as opposed to the GPU. nVidia has emphasized frame pacing and coherence over raw additional card scaling for generations. We won't know anything solid beyond the previously stated performance hit for G-Sync's super duper frame pacing, but I don't think it would be anything higher, proportionately, for two cards versus one, especially since only one card is having to do the timing stuff for the monitor. Still, this isn't church, wait for reviews before making an assessment.
|
# ? Dec 31, 2013 03:19 |