Guni
Mar 11, 2010


Agreed posted:

Not that thin-skinned, haha, I just really do think I've gone too far in the direction of making fun of myself and my purchases to the point that it's now assumed that I'm going to do something stupid - and, well, people who do stupid things don't generally get much traction in more serious discussions. My feelings aren't hurt, nobody here did anything wrong, I'm just worried that I've accidentally made people more likely to associate me with buying new tech rather than knowing about and talking about new tech. I do my very best to stay well-informed, within the boundaries of what we can know without people breaking NDA (and of course keeping tabs on the rare occasions that does happen), so it'd be disappointing if it turned out that nobody is actually listening when I try to talk about stuff that interests me. No intention of being some weird graphics buffoon, in other words.

:defendyourpurchase:


Ghostpilot
Jun 21, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

I can't really talk: I sent back my defective 290 and instead of waiting for the refund before getting a new card, I ordered another one the same day I got my RMA number.

Had I waited for the refund it may have taken 2-3 weeks before I had another card in my hands, and that's if there weren't further delays due to shortages from BF/CM sales. I also figured that by then all the unlockable cards would've long since been snatched up.

So until I get that refund (which hopefully there won't be any issue with), I'm in for over $800.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



Guni posted:

:defendyourpurchase:

No. Here, let me try again, with a little more brevity.

This thread has been around for a while, and what started as a sort of inside joke of "do as I say, not as I do" - because I tend to prefer higher performance and am willing to pay a price premium for it, in the context of a forum where we almost always encourage people to weigh price:performance most heavily - has turned into more than it was ever intended to be. This is especially problematic for me because newer folks who come here don't understand that I actually do know what I'm talking about, and do my best to help others with analysis, etc., in addition to participating in the more interesting discussions, which are unfortunately becoming less frequent. The level of discussion surrounding AMD's launch conference and nVidia's Way It's Meant To Be Played conference never really got particularly exciting, and we didn't talk about the AMD developer conference last month at all. Some of the SH/SC industry insiders and experts just don't really come around anymore for one reason or another, and it's a shame.

Even so - there's still good things to do with the thread and discussion, and I want to continue to be a part of that. But the downside of less esoteric and more accessible discussion is that many of the new people coming in could very easily see a bunch of jokes and get the wrong idea about who I am. I don't like being portrayed as just foolish, even if it can be said to just be a joke. The thread has been more about recommendations for cards or post-your-recent-buys lately, and that's brought in a lot more traffic than it used to get and plenty of new faces, who have no real context for "oh, that's a joke" and might think I actually do just do all sorts of dumb poo poo for no good reason.

If you didn't know me and know that I pay really close attention to keeping tabs on new developments in graphics cards and rendering tech, or that (if I can talk FactoryFactory into it, as he's quite busy!) we'll be taking over the OP from Movax by his request in the near future, you could easily come away with the impression that I'm just another [H] idiot who you probably just shouldn't listen to.

I would prefer for that not to be the case. That's all.

Ghostpilot
Jun 21, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

I absolutely understand where you're coming from. I hadn't realized that it'd become that much of a thing. It hasn't affected your credibility in my eyes, but I can see how it could for newcomers to the thread, especially when you're such a font of information. It's really just a bit of good-natured ribbing, but I can see how it's gotten a little out of hand. Allow me to be the first to apologize for my contribution to it.

antman
Nov 1, 2011


This thread is still awesome for people like me who don't constantly keep up with the latest hardware.
Between this thread and the Build Your Own PC stickied thread, I've cut the cost of the PC I'm looking to build by a few hundred dollars.

So here's my question: considering it's about $100 more, is there a reason I should buy the GTX 780 over the R9 290? I'm not going to miss the $100 if there is a good reason to go with the Nvidia card.
But the 290 seems pretty comparable in all the benchmarks I've looked at.
I'm going to be playing Battlefield 4 @ 1080p on a single screen.

Ghostpilot
Jun 21, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

antman posted:

This thread is still awesome for people like me who don't constantly keep up with the latest hardware.
Between this thread and the Build Your Own PC stickied thread, I've cut the cost of the PC I'm looking to build by a few hundred dollars.

So here's my question: considering it's about $100 more, is there a reason I should buy the GTX 780 over the R9 290? I'm not going to miss the $100 if there is a good reason to go with the Nvidia card.
But the 290 seems pretty comparable in all the benchmarks I've looked at.
I'm going to be playing Battlefield 4 @ 1080p on a single screen.

To be honest, either card would be overkill at that resolution. I'll say this, though: other than a Kyro II way back when, I'd been in the Nvidia camp for about a decade. If the pricing was a bit more comparable to what AMD's offering (especially for my 1440 resolution), I'd have stuck with team green: their stuff just works™. No muss, no fuss.

AMD's value is just really hard to compete with right now, especially with many 290's unlocking to 290x's. If it weren't for the 290 line, I'd probably be running an Nvidia right now.

Edit: Just to add to this: if you really want to get the most out of a 290, you'll have to get your hands dirty with an aftermarket cooling solution. Once you factor that in (about $50), the margin between it and the 780 becomes much closer.

Ghostpilot fucked around with this message at 03:45 on Dec 4, 2013

Stanley Pain
Jun 16, 2001

Bit. Trip. RIP.


El Scotch posted:

I'm hardly any better. As Agreed and I poked fun at each other about last week, I have two 290s. I'm not much of a model of sensible choices.

New drivers fixed my black screens. Can finally use both cards


It's nice being able to play BF4 + Ultra + 150% Rendering. Game looks insanely good at 1440p.

BurritoJustice
Oct 9, 2012



antman posted:

This thread is still awesome for people like me who don't constantly keep up with the latest hardware.
Between this thread and the Build Your Own PC stickied thread, I've cut the cost of the PC I'm looking to build by a few hundred dollars.

So here's my question: considering it's about $100 more, is there a reason I should buy the GTX 780 over the R9 290? I'm not going to miss the $100 if there is a good reason to go with the Nvidia card.
But the 290 seems pretty comparable in all the benchmarks I've looked at.
I'm going to be playing Battlefield 4 @ 1080p on a single screen.

They offer comparable performance. The main upside of the 780 on the hardware front is lower temperatures, much lower noise, and reasonably large overclocking headroom. On the software front, AMD offers TrueAudio and Mantle (neither of which are actually implemented in games yet, and future support announcements are limited as of now). Mantle promises performance improvements in games using it (BF4 is actually the only game that I know of that is going to be updated to use it as of now, so that could sway your purchase), and TrueAudio is GPU accelerated positional audio (a bit hazy on this, someone can correct me if I am wrong). Nvidia offers PhysX (limited support but pretty drat cool visual effects in games that use it), G-Sync (not out yet, and requires additional hardware, but promises to remove screen tearing/latency), as well as a bunch of minor features like TXAA (fancy looking AA) and Adaptive VSync (turns VSync on and off based off your framerate).

Personally, I see the 780 as worth the extra money over the 290, but if you're just looking at the raw level of performance and don't care about all of the smaller quality-of-life benefits of the 780, the 290 offers better performance-per-dollar.
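Since Adaptive VSync came up in passing above, the rule it implements is simple enough to sketch. This is a toy model of the idea, not NVIDIA's actual driver logic; the function name and threshold are mine:

```python
def adaptive_vsync_on(fps: float, refresh_hz: float = 60.0) -> bool:
    """Toy model of Adaptive VSync: enable VSync only while the game
    can sustain the monitor's refresh rate. Below that, plain VSync
    would lock you to refresh/2 (e.g. 30 fps on a 60 Hz panel), so
    the driver disables it and tolerates some tearing instead."""
    return fps >= refresh_hz

# 70 fps on a 60 Hz panel: sync (no tearing, no penalty).
# 45 fps: free-run at 45 instead of being forced down to 30.
print(adaptive_vsync_on(70))  # True
print(adaptive_vsync_on(45))  # False
```

The point of the feature is exactly that second case: avoiding the brutal 60-to-30 fps cliff that ordinary VSync causes when you can't quite hold the refresh rate.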

antman
Nov 1, 2011


Thank you both for your analysis. I'm not interested in OCing that much, at least for now, and on the software side it seems like AMD might have the edge in the future with Mantle's BF4 performance.
Unless they're just full of poo poo or something, ha. AMD and DICE do seem to be working with each other.

jink
May 8, 2002

Drop it like it's Hot.

Taco Defender

Rahu X posted:

Thanks for the tip. I was actually eyeing those exact heatsinks.

Also, upon browsing eBay, I've come across a "lightly used" Antec Kuhler 620 going for $48.99 with free shipping. That with 2 8 packs of those heatsinks only brings me to ~$58.

So, I think I'm going to hop on that before it's too late. Sorry jink. This thing does have a 14 day no hassle return on it though, so if it ends up being absolute poo poo when I get it, I'll be sure to announce it in this thread and I'll be open for business again.

Now I just need to decide what to do for some VRM heatsinks. Those Enzotechs are so goddamn pricey. Debating on just getting some RAM heatsinks and cutting them up.

Wow, yeah, you need a lot of heatsinks, judging by that image of the naked card.

It's fine that you aren't buying from me. I can't compete with that pricing; I bought a couple of 620s for $50, and coupled with $13 shipping that makes the margins very slim. If you or anyone else is interested in AIO cooling of a video card... holla.


[EDIT]: drat this thread has been moving fast!

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



nVidia side also has the whole nVidia Experience software package, which is getting better and better with each release. I was initially skeptical but having actually used it, it does an excellent job and they're keeping the whole "adjust your settings based on your hardware and the game you're playing for the best ratio of looks to performance" library well-curated. It's nice that you can now manually adjust what you consider to be optimal settings if, for example, you prefer TXAA over a different AA method that it suggests and are willing to incur the performance hit (even with a GTX 780Ti it hesitates to select TXAA if FXAA is available, because TXAA carries the same performance hit, roughly, as MSAA, while FXAA has practically no performance hit at all).

It's also pretty neat in that it does it all without having to try to crowdsource anything; AMD has tried to put up a competing software called Gaming Evolved, but it hooks into the social networking site Raptr for its optimizations and has some uncomfortable levels of residency and sharing that you may not care for. I like that nVidia's version is all done in-house without having to partner with a third party, and anything you want online is opt-in. To be fair, AMD probably genuinely can't afford to do something like nVidia Experience on their budget, so it kinda makes sense that they'd partner up, but be aware of the difference.

Something that demands mentioning in the proprietary stuff comparison is a really cool, really significant bit of the nVidia Experience software called Shadowplay, an EXTREMELY low resource utilization, high-definition (or above) video encoder that takes advantage of a hardware encoder on the 600- and 700-series cards (and onward, safe assumption). Its features are really neat, allowing for automatic and manual recordings with a very nice list of cool stuff it can do. It's still in beta but the last update changed it from only being able to record in 1080p to being able to record in other resolutions with no aspect ratio issues as well. It has the capability to grab footage on the fly, as well as get stuff that you didn't explicitly tell it to record so long as it happened in the last ten or twenty minutes (it keeps a buffer full as it goes, so that you can grab your cool whatever on your own schedule). File size is comparable with anything else out there, CPU overhead is practically nonexistent since it's all on the GPU, it now can record audio input from your mic as well, and all without causing any chugging in framerate or inefficiency in CPU utilization.
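That "grab something cool from the last ten or twenty minutes" feature is, at heart, just a rolling buffer that encoded frames fall into and out of. A minimal Python sketch of the idea (toy names and numbers of my own, nothing to do with NVIDIA's actual implementation):

```python
from collections import deque

class ReplayBuffer:
    """Continuously retain only the most recent window of frames so a
    clip can be saved *after* something cool happens."""

    def __init__(self, window_seconds: int, fps: int):
        # deque with maxlen silently drops the oldest frame on overflow
        self.frames = deque(maxlen=window_seconds * fps)

    def push(self, frame):
        self.frames.append(frame)  # the encoder feeds this every frame

    def save_clip(self):
        return list(self.frames)   # "keep that!" - dump the window to disk

# Tiny numbers for illustration: a 2-second window at 3 fps holds 6 frames.
buf = ReplayBuffer(window_seconds=2, fps=3)
for frame in range(10):
    buf.push(frame)
print(buf.save_clip())  # [4, 5, 6, 7, 8, 9] - only the newest 6 survive
```

The real thing keeps compressed video rather than raw frames, which is why the hardware encoder matters: the buffer is cheap only because encoding is nearly free.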

antman posted:

Thank you both for your analysis. I'm not interested in OCing that much, at least for now, and on the software side it seems like AMD might have the edge in the future with Mantle's BF4 performance.
Unless they're just full of poo poo or something, ha. AMD and DICE do seem to be working with each other.

AMD bought the opportunity to do Mantle for $8 million, paid to EA. DICE's engine will be used in several games, as one might expect - I've seen the number 24 come up a few places, and I understand that all parties with any leverage will be leaning on them to implement Mantle support. Don't be fooled by some of the common misconceptions about it, though. For one, it is not "just like the consoles." They have their own development tools and aren't based on Mantle at all. And some of the figures people have thrown around as potential savings are pretty optimistic; I'm waiting for a practical implementation before passing judgment on its efficacy.

It's not magic; it's mostly a more efficient way to get rendering done without spending so much time on the CPU. The concept of a more efficient but just as robust graphics API as DirectX 11/D3D11 is great in theory, but there are some hurdles that are difficult to overcome, and it's not entirely clear as of yet how they will be overcome. What we do know for sure is that there is a very significant amount of overhead in DirectX - less so in OpenGL if programmed very tightly, but there aren't many development studios using OpenGL on the PC (the last major game to use it was RAGE, and that was a flop that seems to have pretty much killed id Software's standing). When they say they can reduce draw call CPU time by a large amount, that's actually pretty feasible, if for no other reason than it's gotta be the lowest hanging fruit, since it takes so damned much time in DX/D3D.
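To put rough numbers on the draw-call argument: if every call costs a fixed slice of CPU time inside the API and driver, then call count times per-call overhead quickly eats the frame budget. A back-of-envelope sketch - every figure below is invented for illustration, not measured from DirectX or Mantle:

```python
# Toy model: CPU time spent merely *issuing* draw calls each frame.
FRAME_BUDGET_MS = 16.7  # ~60 fps

def draw_call_cpu_ms(calls_per_frame: int, overhead_us_per_call: float) -> float:
    """Total CPU milliseconds per frame consumed by draw submission alone."""
    return calls_per_frame * overhead_us_per_call / 1000.0

# A busy scene with 5,000 draw calls per frame:
thick_api = draw_call_cpu_ms(5000, 20.0)  # 100.0 ms - hopelessly CPU-bound
thin_api  = draw_call_cpu_ms(5000, 2.0)   # 10.0 ms - back under budget

print(thick_api, thin_api)
```

Whether Mantle's per-call cost lands anywhere near the "thin" figure is exactly what we're waiting for a practical implementation to show; the arithmetic only illustrates why draw-call submission is the lowest hanging fruit.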

SourKraut
Nov 20, 2005

POST QUALITY UNDER CONSTRUCTION




Agreed, I know you vastly prefer Team Green, but in all the talk about proprietary technologies on each side, let's not forget about TrueAudio for AMD. :P

While similar to Mantle in that we need to wait to see what ends up coming of it, TrueAudio has the chance to truly make quite a nice little change to how the audio API is handled with games, and given that the PS4's DSP *is* based on TrueAudio, I do think we will see it put to use quite a bit. It's one of those interesting technologies in that nVidia really has nothing at all like it, and they don't really have an easy way to push it compared to AMD having it in the PS4.

As for Mantle, while neither the PS4 nor the XBONE is specifically using Mantle, I thought that comments by Sony's Cerny and from developers at DICE had indicated that the PS4's graphics API and Mantle share many of the same optimizations, leaving open the possibility that games ported from PS4 to PC could keep some of those console graphics optimizations. It'll definitely be interesting to see, but if AMD can leverage some of it, it'll definitely help long-term with performance vs. nVidia.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



TrueAudio was covered by the gent before me quite well, and while I do prefer nVidia cards (because AMD doesn't have G-Sync, and because nVidia seems concerned with solving very relevant issues with temporal aliasing and other image quality things in addition to MAKE GO FAST), you don't have to go back very far to see a quick breakdown I did of the hardware improvements from Tahiti XT/XT2 vs. Hawaii, and in the same post I noted that AMD is winning the price:perf war at all price points right now. The only thing that I don't like about AMD is that they're stretched so damned thin that they have to take gambles that may or may not pay off, whereas nVidia tends to be able to just ride things out thanks to their success in all markets apart from the consoles. I think it's a bit of a misconception that nVidia's drivers "just work" - while it tends to be true for videogames on their most recent hardware, ask anyone who bought a 560Ti when they were all the rage just how well the drivers are working for them. In fact, support for Fermi in general has been pretty poor until the most recent two or three betas; there were a few months there where, if you didn't want random crashing and even bluescreens, your best bet was sticking with drivers no higher than 314.

As far as Mantle and console vs. PC goes, I'd look more at hardware similarities as a reason to think that AMD might get some traction with developers. Given that the real problem is WDDM sitting there like a lazy but insistent watchdog, what is it about AMD's cards that might let them get around it to improve efficiency and reduce rendering overhead without breaking The Rules on Windows? I've gotta bread-crumb that one until they go practical with it and we see how it actually pans out, but I have an idea of how they might do it that isn't infeasible (but also isn't ~basically Glide 2.0 to a significant degree).

DICE definitely will be pushing it, but they're one engine up against very strong competition. I reckon we'll see Epic pushing PhysX and other nVidia proprietary technologies on their end, and there have been some surprising comments by strong in-house engine developers regarding PhysX support. I keep waiting for it to just up and die already so I can get this coprocessor out of my system, but as long as cool new games keep supporting it - and frankly look way better when they do, in a way that's more than just "well, here's a cloth, look at it flutter, wheee" - I'll hang around with it.

I'm more interested in some of the integrations of forward and deferred rendering coming from parties without a particular dog in the "which card u-arch is best" fight, frankly. I mean, it's going to be great for AMD buyers to not only have the best current value prospect, period, but then to ALSO get some extra "free" performance on top of that in Mantle-enabled games (largely based on how much said games are CPU-bound, since if it works like I think it works, it's going to do almost all of its work at the level of no longer having to spend so much damned time chatting it up with the CPU). Not knocking that. I would hope that you can see I do my best to be fair to both companies here, man; I get excited about technologies like anybody else. I just would prefer that they be a little more transparent - you can get really robust documentation for the key nVidia technologies mentioned here, but Mantle is awfully proprietary, for reasons that I think will become more apparent when it's put forward practically.

In the meantime, want to know about Mantle? You'll need to sign here, there, there, and here, and by the way, speak up about it with true confirmed belief and you're going to get sued to oblivion. http://developer.amd.com/?s=mantle

We won't get full disclosure documentation until GDC 2014 in March. Hey, by then we might (if the rumor mill is true regarding the possibility of an early release due to TSMC loving the dog on the next two process shrinks) know what that ARM CPU on Maxwell cards is up to, too.

Agreed fucked around with this message at 05:47 on Dec 4, 2013

SourKraut
Nov 20, 2005

POST QUALITY UNDER CONSTRUCTION




I'm either super-tired or blind, but where was the at-length TrueAudio discussion? I missed it. It's the one feature that has me still slightly on the fence about going R9-290, or with the MSI 780 Lightning I keep drooling over (though if I did go the 290 route, I'd be stuck waiting either for the Lightning there or a Matrix edition. :aaaah:).

I was just ribbing you a little about it all. I did like your Tahiti XT discussion, and I actually fully agree regarding many of AMD's issues in terms of "betting the farm" on particular products or technologies, or otherwise having shoddy implementation and support at times. It's part of why it's still a 70-80% chance I'll go Team Green again on this next GPU purchase. Plus I'm always a little partial to nVidia's SLI implementation, if only because I gamed during 3dfx's glory years, so the couple of times I've had an SLI setup it felt like I was indirectly utilizing a 3dfx implementation again (though obviously not really).

Out of curiosity though, as I don't really know too much about it - how open has nVidia been with NVAPI by comparison?

Ghostpilot
Jun 21, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

A friend is looking for a 7970 for his girlfriend's PC, and the prices on them have been skyrocketing lately (within $20 of 290 prices). As it turns out, it has to do with rising Litecoin values; apparently AMD cards are exceptional at mining it.

I have no idea what all of that is about.

John McCain
Jan 28, 2009


Agreed posted:

I think it's a bit of a misconception that nVidia's drivers "just work" - while it tends to be true for videogames on their most recent hardware, ask anyone who bought a 560Ti when they were all the rage just how well the drivers are working for them

In case anyone else has been having the problem with constant bluescreens with modern drivers on Fermi cards and doesn't know how to fix it, going to Power Options and turning PCI-E Link Power Management to "off" fixed the problem for me. Of course, it makes my computer consume ~90W more at idle, but it's winter anyway.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



Anyone still using GPUs to mine for *coins is even dumber than the general idea of an internet-based currency with no way to convert it to real money. They have USB ASICs specifically designed to do the exact calculations that bitcoins need, and they make the highest-end cards look awful in terms of price:performance:watt (where wattage is sort of THE major factor of price at the difficulties they're up to now). Ugh, that's dumb. The whole thing, it's gotta go away at some point, right? I mean, they're just recreating the same system they don't like, except they're doing it pretty poorly... Whatever, not topical, but it sucks that it's affecting prices on older cards that would otherwise be really good deals.
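The price:performance:watt point is easy to see with back-of-envelope arithmetic. All numbers below are illustrative (a plausible card draw, a made-up electricity tariff), not real hashrates:

```python
# Back-of-envelope: electricity is the recurring cost that sinks GPU mining.
def daily_power_cost(watts: float, usd_per_kwh: float = 0.12) -> float:
    """Dollars per day to run a device continuously at the given draw."""
    return watts / 1000.0 * 24.0 * usd_per_kwh

gpu  = daily_power_cost(250.0)  # high-end card under full load: ~$0.72/day
asic = daily_power_cost(2.5)    # small USB ASIC:                ~$0.0072/day

# Even before comparing hashrates, the GPU burns 100x the electricity;
# at current difficulties the ASIC also out-hashes it per watt by a wide margin.
print(gpu, asic, gpu / asic)
```

At the difficulty levels Agreed mentions, the per-watt gap is what makes the "price" column almost irrelevant: the card loses on the electric bill before hardware cost even enters into it.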

SourKraut posted:

I'm either super-tired or blind, but where was the at-length TrueAudio discussion? I missed it. It's the one feature that has me still slightly on the fence about going R9-290, or with the MSI 780 Lightning I keep drooling over (though if I did go the 290 route, I'd be stuck waiting either for the Lightning there or a Matrix edition. :aaaah:).

I was just ribbing you a little about it all. I did like your Tahiti XT discussion, and I actually fully agree regarding many of AMD's issues in terms of "betting the farm" on particular products or technologies, or otherwise having shoddy implementation and support at times. It's part of why it's still a 70-80% chance I'll go Team Green again on this next GPU purchase. Plus I'm always a little partial to nVidia's SLI implementation, if only because I gamed during 3dfx's glory years, so the couple of times I've had an SLI setup it felt like I was indirectly utilizing a 3dfx implementation again (though obviously not really).

Out of curiosity though, as I don't really know too much about it - how open has nVidia been with NVAPI by comparison?

Maybe it wasn't in-depth, but I'm one of the few people on here who chooses to communicate in a shitload of paragraphs, so forgive others their sensible brevity. TrueAudio is cool - I've been excited at the prospect of GPGPU wavetracing since before we learned they were actually going with a DSP chip on-board instead, but whatever, it's neato, though if it beats the general rule of "the more proprietary, the less well supported" it'll only be because some console devs go apeshit with it. I don't recall it being the PS4's sound chip, but it's been a bit since I read a hardware analysis, since I heard they were going to be releasing a hardware revision somewhat soonish, and that has me interested in what's broken and what they intend to fix. If it pans out. Well, that's the rumor mill for you. Haha.

nVidia was at one time somewhat closed regarding NVAPI, but has since opened it up publicly. You can check it out here:

https://developer.nvidia.com/nvapi

But I don't believe Mantle is going to be anything like it, for what it's worth. Oh, and you'll still have to do the whole NDA dance if you want to get to the super low-level stuff. nVidia aren't the good guys, AMD aren't the bad guys, they're both companies trying to one-up each other constantly and we benefit from this and that is awesome.

Agreed fucked around with this message at 06:49 on Dec 4, 2013

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars




John McCain posted:

In case anyone else has been having the problem with constant bluescreens with modern drivers on Fermi cards and doesn't know how to fix it, going to Power Options and turning PCI-E Link Power Management to "off" fixed the problem for me. Of course, it makes my computer consume ~90W more at idle, but it's winter anyway.

Reported unreliable - again, by GeForce Community schmucks, so not even God knows how well it holds - and also 331.93 actually looks promising so far. Not going to call it fixed until it reaches a week or two without issue, though.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!


BTW, Shadowplay stopped being such a huge advantage with RivaTuner SS 5.3, which supports near-free capture by encoding video using Intel's QuickSync.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!


Agreed posted:

We won't get full disclosure documentation until GDC 2014 in March. Hey, by then we might (if the rumor mill is true regarding the possibility of an early release due to TSMC loving the dog on the next two process shrinks) know what that ARM CPU on Maxwell cards is up to, too.

SteamBox: Just add video card.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



deimos posted:

BTW, Shadowplay stopped being such a huge advantage with Rivatuner SS 5.3, it supports near-free capture by encoding video using Intel's Quicksync.

I dunno, Alexey seemed like a pretty big fan! Why, he even had plans to integrate it.

Alexey Nicolaychuk aka Unwinder, RivaTuner creator, on 10-29-2013 posted:

I’ve read a lot of reviews related to NVIDIA ShadowPlay during the last couple days and there are typical misunderstanding patterns in many of them, so I decided to write this short summary to give you better understanding, guys.

Most of users (and surprisingly reviewers) assume that the main performance hit for any videocapture oriented software comes from realtime video encoding. That is quite typical and mostly wrong assumption. Modern videocapture applications are multithreaded and modern CPUs have multiple cores able to process multiple tasks in parallel, so realtime video encoding is normally performed in background thread(s) and often with lower priority. That is why when you’re capturing video in the game which is not using close to 100% of CPU time on all CPU cores, background video encoding have enough system resources to run in parallel and hardly affects game performance. In other words, having fast hardware encoder like NVENC or Intel QuickSync simply allows you to encode better quality video with higher encoding framerate or more efficient compression ratio in realtime, but it doesn’t seriously minimize videocapture related performance hit like many reviewers say.

The main and the real performance penalty for videocapture application is … a simple process of capturing frames from DirectX application. DirectX rendering pipeline is asynchronous, so multiple frames are being processed in parallel by different stages of the pipeline. And that’s where our performance penalty come from – DirectX architecture is simply not built with idea of effective frame readback from the very end of rendering pipeline. So each frame capture via DirectX causes DirectX rendering pipeline to be flushed, CPU may stall until GPU finish flushing the pipeline and provide captured frame back to CPU, etc, etc. This means that each simple frame capture iteration hurt the efficiency of prerendering, may seriously reduce the performance of multi-GPU systems and so on.

And that’s where NVIDIA ShadowPlay magic begins. I’m very supersized to see that all reviewers focus on Kepler’s NVENC hardware H.264 encoder only without even mentioning Kepler’s innovative and unique NVFBC (Frame Buffer Capture) and NVIFR (Inband Frame Readback), which also help NVIDIA ShadowPlay a LOT. Those two are probably even more important videocapture oriented hardware technologies introduced in Kepler GPUs. Both are promoted by NVIDIA as ultra fast low-latency GPU/DMA accelerated framebuffer capture techniques, both are targeted at cloud gaming / GRID systems (you can read more in this presentation). It is some very very “light” kind of NVIDIA’s “Mantle” in frame capture functionality area – NVFBC and NVIFR are available via NVAPI and provide developers very effective way of frame capture via direct access to hardware bypassing the limitations of DirectX API.
So ShadowPlay’s success is built on two key hardware technologies: NVFBC for very effective frame capture and NVENC for very effective frame compression. BTW, NVFBC is a direct reason why ShadowPlay is not capturing video in windowed mode, if you follow the link above, you’ll see that NVFBC is able to grab whole frame buffer only.

And now some really good news: using NVENC is third party video capture applications is currently troublesome due to some strange licensing scheme applied to it by NVIDIA. Hope it will change in future, because competing Intel QuickSync H.264 encoding is freely available to any developer and it does great job. But NVFBC and NVIFR interfaces are available to any NVAPI developer, which means that very effective frame capture can be easily added to ANY existing third party videocapture application right now with minimum development cost. So at least frame capture related bottleneck can be taken out of context on NVIDIA Kepler hardware. Personally I’ve chosen to power videocapture engine of RivaTunerStatisticsServer v5.5.0 by NVIFR, even considering that NVFBC provides a bit faster capture, I won’t like to sacrify windowed videocapture support. And I’m pretty sure that other video capture tools will also get support for those new NVIDIA technologies soon."

But then something happened.

The Same Guy, a while before posted:

- Added external QSV.DLL encoding plugin. The plugin provides you with high-performance hardware accelerated H.264 encoding on Intel QuickSync Video capable platforms. The Intel QuickSync H.264 encoder is able to compress 1080p video at 60 FPS with no major CPU performance hit. Hardware accelerated Intel QuickSync H.264 encoding was introduced specially to compete with NVIDIA’s ShadowPlay hardware accelerated H.264 encoder. Free hardware accelerated H.264 video capture and encoding is no longer an exclusive selling point of the NVIDIA Kepler GPU family; now the same functionality is available on a much wider range of hardware platforms, with both AMD and NVIDIA GPU based graphics cards, absolutely for free!

Hmm.

Guessing some money changed hands there, what do you think?

Edit: Seriously, what do you think, it can be read two ways - is he going from the broadly compatible QuickSync solution found in, what, 5.3 forward to ShadowPlay only in 5.5 forward? If so, why? Because it's better? Because he got paid? Or am I just misreading him entirely?

Agreed fucked around with this message at 15:47 on Dec 4, 2013

Stanley Pain
Jun 16, 2001

Bit. Trip. RIP.


Agreed posted:

I dunno, Alexey seemed like a pretty big fan! Why, he even had plans to integrate it.


But then something happened.


Hmm.

Guessing some money changed hands there, what do you think?

Edit: Seriously, what do you think, it can be read two ways - is he going from the broadly compatible QuickSync solution found in, what, 5.3 forward to ShadowPlay only in 5.5 forward? If so, why? Because it's better? Because he got paid? Or am I just misreading him entirely?


The ability to hit a broader market?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



It seems like his English isn't absolutely stellar, so maybe I'm misinterpreting, but it sounds to me like he's changing it from QuickSync to ShadowPlay. That's going from Intel proprietary (which pretty much everyone has these days) to nVidia proprietary and only from the 600-series onward and not all cards. Maybe it's something he would have done all along if NVAPI had been sorta open to public developers before, maybe it really just is a much better capture technology (QuickSync videos don't look very good to me, while Shadowplay videos look great to me). I'm curious as to whether he means to replace it completely or just add support - the way it's worded sounds more like he's changing over entirely to Shadowplay.

Straker
Nov 10, 2005


Holy christ, retard litecoin miners are buying 7970s for $350 now (as everyone in this thread should know, they are worth only like $240 new). Great time to ebay yours (or likely any good ATI card) if you're due for an upgrade! I wonder if I could get $600+ for my 7990...

edit: seriously, what the gently caress, $400 for a USED 7970 GHz. I guess the price has gone crazy because of the "lack of supply", they're too stupid to buy 280Xes instead. Now I'm sad I didn't go with crossfired 7970s after all...

edit 2: holy crap 7990s are going for over $800, I know what I have to do now

Straker fucked around with this message at 18:23 on Dec 4, 2013

Shadowhand00
Jan 22, 2006

Golden Bear is ever watching; day by day he prowls, and when he hears the tread of lowly Stanfurd red,from his Lair he fiercely growls.

Toilet Rascal

Holy, I'm going to sell one of my 5970s today. I've seen a few go for $400.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



Hahaha this is awesome, have fun letting those idiots screw themselves and upgrading your graphics experience to an R9 290 in the bargain. GPUs are so power inefficient compared to ASICs designed specifically for Bitcoin mining, and the exchange rate doesn't allow for any sort of arbitrage in their pseudocryptocurrency, so I have no idea why in the gently caress they're going apeshit over these but get while the getting's good and come out the other side with a modern, badass graphics card.

SourKraut
Nov 20, 2005

POST QUALITY UNDER CONSTRUCTION




I should see what I can sell my old reference 5850 for at this rate, it's still going strong.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


Agreed posted:

Hahaha this is awesome, have fun letting those idiots screw themselves and upgrading your graphics experience to an R9 290 in the bargain. GPUs are so power inefficient compared to ASICs designed specifically for Bitcoin mining, and the exchange rate doesn't allow for any sort of arbitrage in their pseudocryptocurrency, so I have no idea why in the gently caress they're going apeshit over these but get while the getting's good and come out the other side with a modern, badass graphics card.

Litecoin is Bitcoin designed to use a different algorithm, meant to eliminate the CPU -> GPU -> ASIC prisoner's dilemma. They failed and GPU mining is 10x more efficient, but ASIC implementations will be TONS more expensive to implement.

The other change is that blocks generate faster and there will be 4 times as many fireflies as butts, so you get MORE COINS FASTER.
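Factory Factory's point about the algorithm swap can be sketched in a few lines. A minimal illustration using Python's `hashlib`: Litecoin replaces Bitcoin's double-SHA256 proof-of-work with scrypt at parameters N=1024, r=1, p=1 (those are Litecoin's actual published settings), with the 80-byte block header used as both password and salt. The header bytes below are dummy data, not a real block.

```python
import hashlib

# Litecoin's proof-of-work: scrypt with cheap parameters (N=1024, r=1, p=1).
# The 80-byte serialized block header is both the password and the salt.
header = b"\x00" * 80  # dummy stand-in for a real serialized block header

pow_hash = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

# Bitcoin's equivalent, for contrast: SHA256 applied twice to the header.
btc_hash = hashlib.sha256(hashlib.sha256(header).digest()).digest()

print(pow_hash.hex())
print(btc_hash.hex())
```

The memory-hardness of scrypt was the part meant to break the CPU → GPU → ASIC escalation; as the post says, it didn't quite work out, but at these parameters each hash still touches 128 KiB of state (128 × N × r bytes), which is why 2013-era SHA256 ASICs can't run it.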

Stanley Pain
Jun 16, 2001

Bit. Trip. RIP.


Agreed posted:

It seems like his English isn't absolutely stellar, so maybe I'm misinterpreting, but it sounds to me like he's changing it from QuickSync to ShadowPlay. That's going from Intel proprietary (which pretty much everyone has these days) to nVidia proprietary and only from the 600-series onward and not all cards. Maybe it's something he would have done all along if NVAPI had been sorta open to public developers before, maybe it really just is a much better capture technology (QuickSync videos don't look very good to me, while Shadowplay videos look great to me). I'm curious as to whether he means to replace it completely or just add support - the way it's worded sounds more like he's changing over entirely to Shadowplay.

Hahaha, I read that completely backwards. Now I'm not even sure what he's attempting to do.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!


Agreed posted:

Edit: Seriously, what do you think, it can be read two ways - is he going from the broadly compatible QuickSync solution found in, what, 5.3 forward to ShadowPlay only in 5.5 forward? If so, why? Because it's better? Because he got paid? Or am I just misreading him entirely?

He doesn't say he's going NVIFR-only, does he? Just that he's going to add it.

Specifically this chunk: (bolding mine)

quote:

But NVFBC and NVIFR interfaces are available to any NVAPI developer, which means that very effective frame capture can be easily added to ANY existing third party videocapture application right now with minimum development cost. So at least the frame-capture-related bottleneck can be taken out of the picture on NVIDIA Kepler hardware. Personally I’ve chosen to power the videocapture engine of RivaTunerStatisticsServer v5.5.0 with NVIFR; even considering that NVFBC provides a bit faster capture, I wouldn’t like to sacrifice windowed videocapture support. And I’m pretty sure that other video capture tools will also get support for those new NVIDIA technologies soon.

deimos fucked around with this message at 18:50 on Dec 4, 2013

eXXon
Aug 19, 2002



Are *coin miners also paying stupid prices for R9 280x's? Because I got one for $250 and don't really need a card that beefy, although I could just keep it for the awesome bargain it was.

Are there any rumours about the Radeon Rewards/Never Settle bundle getting updated? All I got was a lame Bronze reward, and I don't really want Far Cry: Blood Dragon and already played DX:HR and Sleeping Dogs.

Josh Lyman
May 24, 2009





Straker posted:

Holy christ, retard litecoin miners are buying 7970s for $350 now (as everyone in this thread should know, they are worth only like $240 new). Great time to ebay yours (or likely any good ATI card) if you're due for an upgrade! I wonder if I could get $600+ for my 7990...

edit: seriously, what the gently caress, $400 for a USED 7970 GHz. I guess the price has gone crazy because of the "lack of supply", they're too stupid to buy 280Xes instead. Now I'm sad I didn't go with crossfired 7970s after all...

edit 2: holy crap 7990s are going for over $800, I know what I have to do now

Just listed my 2 week old 7970 GHz edition for $399 Buy It Now. Come on bitcoiners!

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you


Josh Lyman posted:

Just listed my 2 week old 7970 GHz edition for $399 Buy It Now. Come on bitcoiners!

Let us know if you sell it because if it does go I'm going to list my 7970 with ACIII cooler on it for that much or slightly more.

Ham Sandwiches
Jul 7, 2000



Agreed posted:

I dunno, Alexey seemed like a pretty big fan! Why, he even had plans to integrate it.


But then something happened.


Hmm.

Guessing some money changed hands there, what do you think?

Edit: Seriously, what do you think, it can be read two ways - is he going from the broadly compatible QuickSync solution found in, what, 5.3 forward to ShadowPlay only in 5.5 forward? If so, why? Because it's better? Because he got paid? Or am I just misreading him entirely?

I read it slightly differently. Just to clarify again, there's two aspects to recording footage - the capture phase, and the encoding phase.

The capture phase is the most problematic phase, as Alexey describes in the blog post related to 5.3. When I first started messing around with streaming, enabling screen capture in Xsplit / VH Screen Capture would drop my framerate to roughly half in any application. I didn't know why at the time, but it seems the way DirectX renders is the reason: naively capturing means reading each frame back from the GPU to system memory, which stalls the rendering pipeline. To do capture without crippling your performance, you either need to use a proprietary solution like Nvidia's NVFBC and NVIFR interfaces, or you need a dedicated capture card. That's just to get the stream to begin with, and Nvidia's interfaces weren't available until just recently.

Then you need to encode that stream. Alexey says that's not an issue any more but I disagree. It's not as much of an issue if you're going to upload a video to Youtube or encode it for your own viewing later. It is a big issue if you are going to try to stream in real time, like on Twitch.

So you have the game feed from Shadowplay / capture app / capture card, now you need to encode it. x264 is really, really good quality - it allows you to keep your bitrate lower, but the footage still looks nice. The downside is that it's CPU based, and that load scales with pixels - resolution and framerate. The combos I see most often are 720p@60fps and 1080p@30fps. There are some people that stream 1080p@60fps, but it's select games (not too demanding ones) and there are very few viewers that can watch that footage smoothly. So for reference, an i7-3770 runs at about 80% CPU usage on all 4 cores encoding 1080p@30fps in real time using x264 at 'faster' settings. That's on a dedicated streambox, i.e. not playing the actual game.
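The resolution/framerate combos above come down to raw pixel throughput. A quick back-of-the-envelope in Python - the assumption that encoder load scales roughly linearly with pixels per second is mine, real x264 load also varies with preset and content:

```python
# Rough pixel-throughput comparison for the common streaming combos.
# Assumption (mine, not from the post): for a fixed x264 preset, encoder
# load scales roughly linearly with pixels/second.
combos = {
    "720p@60": (1280, 720, 60),
    "1080p@30": (1920, 1080, 30),
    "1080p@60": (1920, 1080, 60),
}

# The post's reference point: 1080p@30 at ~80% CPU on an i7-3770.
baseline = 1920 * 1080 * 30

for name, (w, h, fps) in combos.items():
    pixels_per_sec = w * h * fps
    print(f"{name}: {pixels_per_sec / 1e6:.1f} Mpx/s "
          f"({pixels_per_sec / baseline:.2f}x the 1080p30 load)")
```

Running this shows 720p@60 and 1080p@30 land within about 10% of each other, while 1080p@60 is exactly double the 1080p@30 load - which is why, if 1080p30 already eats ~80% of a quad core, almost nobody streams 1080p60 off the same box.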

Now, for those who want to game and record on one machine, loading up your CPU like this with encoding is not stellar. If you can live with the quality drop, both Nvidia and Intel can do the encoding for you - thus removing that load from your CPU.

So basically, on the capture side, Shadowplay's stream off the interface will be available as a source for Nvidia cards in Rivatuner RTSS, meaning you should be able to encode that feed with the encoder of your choice. (if I have understood correctly) It may force you to use NVENC for encoding if that's part of Nvidia's licensing of Shadowplay.

And on the encoding side, it seems that Nvidia had weird licensing around their encoder / NVENC functionality. Third party apps couldn't use it? You could only use it to encode a Shadowplay capture? I am not clear on that part. Either way, it seems that you can now either use Nvidia's H.264 encoder on their cards, or Intel's built-in QuickSync H.264 encoder. Not sure if these options work with different stream sources, etc. Both look inferior compared to CPU based x264 encoding, which means you'll need to use higher bitrates, but in general I feel QuickSync encoded video could be better while Nvidia's looks acceptable.

I think this may or may not be part of a general push to get QuickSync into more applications - it was quite poorly supported until recently. Open Broadcast Software does have support for using QuickSync to encode your stream (and something like DXTory for low impact capture) but as noted, QuickSync encoded game footage looks a bit rough.

Ham Sandwiches fucked around with this message at 19:56 on Dec 4, 2013

Straker
Nov 10, 2005


Josh Lyman posted:

Just listed my 2 week old 7970 GHz edition for $399 Buy It Now. Come on bitcoiners!

Just listed my barely used 7990 for... a lot, but with a Buy It Now way more reasonable than the $1300+ I'm seeing. There are other used cards already bid up over $800 and a new one with a $1200 bid on it. I haven't had any more artifacting issues but PowerColor doesn't do product registration anyway, so the 2-year warranty should be transferable easily enough even if it's not supposed to be. Now to figure out how to make games run real good with $600-900...

real_scud posted:

Let us know if you sell it because if it does go I'm going to list my 7970 with ACIII cooler on it for that much or slightly more.

There are a ton of listings with bids on them, you can definitely sell it for $350 minimum if you have the slightest inclination to upgrade, even if "only" to a regular 290 or whatever. Much more than that seems kinda ambitious, the one listing I saw over $400 was for a card with a waterblock on it, and there's at least one with a mere $450 buy it now that's just sitting there, so it seems like litecoiners think they're worth around $400.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!



Umm, just throwing this out there: x264 is just a software h.264 encoder. The only difference with NVENC/QuickSync/VCE is that the hardware h.264 encoders tend to use certain profiles and are less flexible, but they output the same format.

deimos fucked around with this message at 20:14 on Dec 4, 2013

Ham Sandwiches
Jul 7, 2000



deimos posted:

Umm, just throwing this out there: x264 is just a software h.264 encoder. The only difference with NVENC/QuickSync is that the hardware h.264 encoders tend to use certain profiles and are less flexible, but they output the same format.

Sorry if I was unclear. What I was trying to say is that you can choose profiles in x264 that will produce nicer looking video at a given resolution / bitrate at the cost of high CPU load. Hardware encoders seem to produce quite different output quality, based on how they're configured. In particular, Avermedia's seems pretty bad on their capture card and nobody uses that for encoding unless they are recording locally at ridiculous bitrates. QuickSync is a lot better than Avermedia's but people still notice the quality loss. Nvidia seems to have done a good job with theirs and very few people are bothered by Nvidia encoded Shadowplay footage compared to CPU encoded x264.

Folks were very excited about QuickSync letting them offload all encoding from their CPU, and found that there was a pretty big tradeoff when it was implemented into OBS.

Ham Sandwiches fucked around with this message at 20:11 on Dec 4, 2013

Josh Lyman
May 24, 2009





Josh Lyman posted:

Just listed my 2 week old 7970 GHz edition for $399 Buy It Now. Come on bitcoiners!

Annnd we're sold!

Actual question for the thread. The 7970GE was overkill for my needs. If my most demanding gaming is Civ 5 and Counter Strike: Global Offensive at 1440p, what would be a "good enough" card? 760?

edit: woops forgot about the parts picking thread

Josh Lyman fucked around with this message at 21:48 on Dec 4, 2013

Blorange
Jan 31, 2007

A wizard did it



I would've said just take the free upgrade to a 290, but looking online those are sold out on amazon and newegg now.


Straker
Nov 10, 2005


Yeah, litecoin miners bought like everything it seems. Oh well, I can wait, I'll be out of town for 10 days starting next monday.

ahem

My 7990 just sold for A THOUSAND loving DOLLARS, guess I should have set it a little higher. Too bad ebay fees eat $100, would have been nice to double my initial investment

edit: actually it looks like litecoin mining is somewhat lucrative right now, my 7990 could be making ~$850/month after power costs etc. but it's probably going to suffer from massive difficulty increases with all this new attention, and then I can't play games and my card is going to be louder and hotter than it already is, and I don't imagine litecoin exchanges behaving for very long before they start getting prosecuted, stop letting customers take their money out etc. etc.
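The "~$850/month, until difficulty catches up" math above sketches out like this. Every input here is my own placeholder (a ~1.3 MH/s card, a ~38 GH/s network, $30/LTC, 450 W at $0.12/kWh), chosen only so the baseline lands near the quoted ~$850 figure; the 28,800 LTC/day issuance does follow from Litecoin's 2.5-minute block target and 50 LTC reward (576 blocks/day × 50).

```python
# Toy litecoin-mining profitability sketch. The hashrate, price, and
# power numbers are hypothetical placeholders, NOT real 2013 figures.
# The point is the shape of the math: revenue scales with your share of
# the network hashrate, so difficulty growth erodes it, while the power
# bill stays fixed.
def monthly_profit(my_khs, network_mhs, coins_per_day, ltc_price_usd,
                   watts, usd_per_kwh):
    share = my_khs / (network_mhs * 1000.0)       # fraction of network hashrate
    revenue = share * coins_per_day * 30 * ltc_price_usd
    power_cost = watts / 1000.0 * 24 * 30 * usd_per_kwh
    return revenue - power_cost

# A hypothetical ~1.3 MH/s card on a hypothetical ~38 GH/s network...
base = monthly_profit(1300, 38000, 28800, 30.0, 450, 0.12)
# ...and the same card after the network hashrate doubles.
after = monthly_profit(1300, 76000, 28800, 30.0, 450, 0.12)
print(f"now: ${base:.0f}/mo, after hashrate doubles: ${after:.0f}/mo")
```

Under those assumptions the baseline comes out near the $850/month the post mentions, and a single doubling of network hashrate cuts it by more than half - which is exactly the "massive difficulty increases with all this new attention" worry.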

Straker fucked around with this message at 22:18 on Dec 4, 2013
