dog nougat
Apr 8, 2009
drat, the 290 for $400 is really enticing. Kind of a shame about the stock cooler being a bit obnoxious. I'm very seriously considering dropping the $$ on one once the custom cooled versions become available, but I wish EVGA made AMD hardware. I was looking at the Sapphire version of the 290; are they a recommended AMD card manufacturer? I'm also really tempted to wait and see what Maxwell offers, just so I could snag an EVGA card. I recently upgraded to 2560x1440 and my 570 is barely cutting it.


Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

SourKraut posted:

Well, the game package ends on November 26th, and so far we don't know of a replacement. I'm not sure the "features" are really worth an extra $100 - the main things are Shadowplay (which can also be done by 3rd party apps with low overhead) and PhysX ( :laffo:). In that vein, someone could make the same argument for TrueAudio and Mantle in the near future.

The current games are nice, but once again the true value depends upon how the buyer values the bundle - for every person who thinks the games are great, there is likely someone who doesn't care about them at all.

I don't know; the only reason I was worried about upgrading my CPU was for 1080p Fraps, and Shadowplay makes that a non-issue for me, so that's quite a bit of value right there. G-Sync is really appealing to a lot of people too; AMD doesn't have their own version of that yet, and it can make any GPU appear smoother than it actually is. Even if you don't care about the bundle, you can still pitch it on eBay or SAmart for $60-ish like I'm doing.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Agreed posted:

They'd need to restructure their entire lineup. Maybe they will - I'm not predicting the future, just stating my take on things. Though I do think that a production-proven thing like Shadowplay is actually a Good Feature vs. TrueAudio and Mantle being kind of nebulous, in-the-future-these-may-be-cool? things for AMD card owners. The nVidia Experience software is imo considerably better than AMD's Raptr software, but that could be subjective.
I could see room still for an additional price cut, given the uncertainty over Titan and how it'll slot in given the greater performance of the 780 Ti. I do think Shadowplay is cool, but realistically how many people will truly use it? That's the issue with the "who knows" nature of TrueAudio and Mantle - if they do end up being implemented, each could essentially surpass the best "bonus" features nVidia has to offer, as Shadowplay is cool but not as cool as TrueAudio if it takes off. It's a big "if" though. In terms of the "experience," though, nVidia definitely has the better of the two. I don't really care for the whole Raptr implementation at all.

quote:

This month we're supposed to find out a great deal more about Mantle, including announcements about other games in development using it, so that could be a significant factor that alters the value proposition once again (or not, just a big unknown at the moment). I love the idea of TrueAudio, but it's something that is pretty easily countered if necessary, like when AMD waaaay back when put audio over HDMI on their cards and nVidia just did that too, and thus could end up just kinda fragmenting the market unnecessarily.

Here's a tangential thing that's bugging me - AMD seems like they're in a rough spot with these proprietary technologies they're working on, in that they would enjoy more success and broader adoption if they weren't proprietary (a lesson they learned too late with MLAA vs. FXAA back when shader-based AA was hot new poo poo), but at the same time they've got a lot of pressure on them to keep these things proprietary in the hopes that broader adoption of the exclusive technologies will translate into greater market share. I know they've upped efforts with developers lately, and certainly their console wins across the board will help grease the wheels there, but it's not a panacea, and they can certainly harm their own technologies' chances at broader adoption while trying to capitalize exclusively on them.
Yeah, I do think Mantle is almost one of those "it's going to help AMD in a big way, or be a complete dud" technologies. The promising aspect is that AMD could potentially leverage its console monopoly to get developers to utilize it more widely, and if they do, they'll have quite the leg up on nVidia. Brand name or not, people going between PC and console only have to see that their favorite titles have "Mantle by AMD" for it to potentially impact their hardware purchases, especially if it offers a noticeable performance improvement PC-side.

AMD isn't alone, though, in terms of technologies that didn't really go anywhere. nVidia's whole 3D Vision concept never really got going (I know it's still "used", but that's like saying some crazy people still use pagers). So we'll just have to see how it's handled, and what, if anything, nVidia can truly counter it with. Possibly a resurgent NVAPI, but we'll see. Without any presence in the console space, nVidia is potentially at a huge disadvantage.

quote:

Meanwhile, AMD is still in the position of engaging in price pressure at all tiers rather than being able to unequivocally claim performance superiority. Yes, I know that the cards are apparently good overclockers, which probably means they've got a GHz Edition refresh lined up for some appropriate time, but if the 780 Ti is a fully enabled GK110 for videogames, it's going to have better performance than an R9 290X, and given how few people actually buy the very expensive cards, all that matters there is that they can say they have the fastest card, not that they are selling a shitload of the fastest card.
I'm not sure that unequivocal performance superiority is really a thing except at the top end. Tiers are essentially subjective, because both nVidia and AMD can slot their cards at an equal price tier or between each other's price tiers based upon performance, features, bundles, etc., which makes it difficult to really say that one has performance superiority over the other. The 7950 essentially trades blows with the 760 while being a little cheaper and having a better bundle. The 280X easily passes the 760, but also costs $50 more. The 770 trades blows with, but is likely a little faster than, the 280X, but also starts at $30 more and has less memory. The 290 non-X is easily the best performance card available for the price: it easily goes past the 770, currently has the same amount of memory at essentially the same price as a 770, but with performance equivalent to the stock 780 that costs $100 more. The price difference will shrink once custom cooled cards come, but it'll still be a big difference. The 780 is kinda in a quagmire - it's not any better a performer than the 290, but it is a lot cooler and quieter, and currently has the nice game bundle. Then again, for only $50 more (MSRP) you can get a higher-performing 290X, though the 290X is a bad performance:price card now that the 290 is out. Then we get to the 780 Ti at $700, which will be the best-performing card but also costs $150 more than the 290X. (A rough perf-per-dollar sketch of these numbers follows this post.) So yeah, nVidia will have performance superiority in a single true tier, the ultra-high end, that probably a hundredth of a percent of the market shops in. I'm not really sure superiority matters in that tier except to those who can shop in it.

Otherwise, what one side might view as "price pressure", the other side could equally say is just price correction due to nVidia's price gouging during the 700 series of cards. Because a $650 GTX 780 was definitely price gouging to a lot of people.

Ultimately I'm just tired of all of the back-and-forth right now: I want a card for my 1440p monitor, and I had myself essentially convinced on the MSI Lightning 780, but the possibility of a custom cooled 290 non-X for quite a bit cheaper has caused me to once again second-guess my choice. I'm still leaning toward the 780, but hoping we might see some type of additional price cut.
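
To put rough numbers on the tier comparison above, here's a throwaway Python sketch. The prices are the figures quoted in the post; the "perf" indexes are hypothetical placeholders chosen only to encode the ordering described above (290 roughly equal to a stock 780, 290X a bit higher, 780 Ti on top), not benchmark data.

code:

# Rough perf-per-dollar sketch. Prices are the figures quoted above; the
# "perf" indexes are HYPOTHETICAL placeholders that only encode the rough
# ordering described in the post, not real benchmark results.
cards = {
    "R9 290":     {"price": 400, "perf": 100},  # ~ stock 780 performance
    "GTX 780":    {"price": 500, "perf": 100},
    "R9 290X":    {"price": 550, "perf": 105},
    "GTX 780 Ti": {"price": 700, "perf": 115},
}

for name, c in sorted(cards.items(),
                      key=lambda kv: -kv[1]["perf"] / kv[1]["price"]):
    print(f"{name:11s} perf per dollar: {c['perf'] / c['price']:.3f}")

Run as-is, it ranks the 290 well ahead of everything else, which is the post's point: the 780 Ti can win the top tier outright while still losing on value.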

veedubfreak
Apr 2, 2005

by Smythe

Michymech posted:

http://www.youtube.com/watch?v=J3Vi8gPsCbk
That's the noise that I get. I got the OK to have my card sent back to be checked out and get a refund, as it's just really annoying to hear. The fan noise isn't a problem because a water block will be going on it, but the coil whine will just be worse without the fan there.

Honest to god, I must be deaf. I could not hear any coil whine in that video. So ya... Anyhoo, I am returning my 290X and I ordered 2 XFX 290s and 2 EK water blocks :) I'll have one water block Thursday for sure, but I'm not sure how long before the second one comes in, as they only had one left in stock.

fookolt
Mar 13, 2012

Where there is power
There is resistance
So is the R9 290(x) really that loud and hot? Or was it just a case of how it was configured during the benchmarks?

veedubfreak
Apr 2, 2005

by Smythe
At 60% or more fan speed, it's loud.

Magic Underwear
May 14, 2003


Young Orc

fookolt posted:

So is the R9 290(x) really that loud and hot? Or was it just a case of how it was configured during the benchmarks?

I don't think it's bad at all if you keep it in quiet mode. Uber mode starts to get pretty loud, and if you hack the fan to max out at 100% then it becomes a jet engine.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

veedubfreak posted:

Honest to god, I must be deaf. I could not hear any coil whine in that video.

I don't think coil whine is a noise that records very well. Cell phone microphones don't exactly have a great frequency range.
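
For the curious: coil whine typically sits well up in the treble, and a capture chain tuned for voice simply can't represent it (a recorder sampling at 16 kHz has a Nyquist ceiling of 8 kHz). Here's a minimal stdlib-only Python sketch that writes a high-pitched test tone to a WAV file; the 12 kHz figure is an assumed, purely illustrative whine frequency.

code:

# Write a 12 kHz sine (roughly the register coil whine lives in) to a WAV
# file using only the standard library. A voice-oriented chain sampled at
# 16 kHz tops out at 8 kHz (Nyquist) and cannot capture this tone at all.
import math
import struct
import wave

RATE = 44100      # CD-quality sample rate: Nyquist limit is 22.05 kHz
FREQ = 12000.0    # assumed whine frequency in Hz (illustrative)
SECONDS = 2

frames = b"".join(
    struct.pack("<h", int(20000 * math.sin(2 * math.pi * FREQ * n / RATE)))
    for n in range(RATE * SECONDS)
)

w = wave.open("whine.wav", "w")
w.setnchannels(1)    # mono
w.setsampwidth(2)    # 16-bit samples
w.setframerate(RATE)
w.writeframes(frames)
w.close()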

Boar It
Jul 29, 2011

Mesmerizing eyebrows is my specialty
All this talk about Mantle and how the current Nvidia cards aren't "next gen" and whatnot is making me less eager to buy a 770. Thing is, I doubt the hypothetical 870 will be as cheap as the current 770 card, at least not for a while. But it is a new architecture, so I guess there will be a much bigger difference compared to the 600 and 700 series, and apparently Mantle might work with "other GPU brands" in the future? Why is my GPU failing now, god drat it. I really want to hop over to Nvidia and see what that's like, but then it isn't "next gen" and other buzzwords.

But isn't Mantle just an API? Meaning that it won't be dependent on hardware? So older cards could in theory support it as well. I doubt it will happen though since both Nvidia and AMD love to put in arbitrary barriers.

And apparently Star Citizen will use Mantle. Being able to run that as good as possible would be great. Sucks that we still have no idea how well Mantle will actually perform.

Boar It fucked around with this message at 00:45 on Nov 6, 2013

GrizzlyCow
May 30, 2011

Torabi posted:

All this talk about Mantle and how the current Nvidia cards aren't "next gen" and whatnot is making me less eager to buy a 770. Thing is, I doubt the hypothetical 870 will be as cheap as the current 770 card, at least not for a while. But it is a new architecture, so I guess there will be a much bigger difference compared to the 600 and 700 series, and apparently Mantle might work with "other GPU brands" in the future? Why is my GPU failing now, god drat it. I really want to hop over to Nvidia and see what that's like, but then it isn't "next gen" and other buzzwords.

But isn't Mantle just an API? Meaning that it won't be dependent on hardware? So older cards could in theory support it as well. I doubt it will happen though since both Nvidia and AMD love to put in arbitrary barriers.

And apparently Star Citizen will use Mantle. Being able to run that as good as possible would be great. Sucks that we still have no idea how well Mantle will actually perform.

Mantle is tied to AMD's GCN architecture. If you want to use Mantle, you'll have to have a GCN 1.x card. The reason that Mantle will deliver better performance is that it is designed around AMD's GCN. If you get a 770, you won't be able to use it. It's not like the 770 will become unable to play Star Citizen anyway, though.

Teenage Fansub
Jan 28, 2006

I'm bootcamping Windows 8 on an iMac, which has an AMD mobility GPU.
Just wondering if it's possible and safe to run a regular non-mobility beta driver on an 'M' card.

Digital Jesus
Sep 11, 2001

I'd be surprised if it actually lets you use it. Normally it'll say that there's no valid card in the system.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug
Any good reviews of the AMD 240/250 cards? I've got an APU right now that I work off of; it's okay, but drat does it suck with KSP/DX:HR/CS:GO at 1920x1080.

Hoping to pick this up Friday: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127764

Dilbert As FUCK fucked around with this message at 03:23 on Nov 6, 2013

Bloody Hedgehog
Dec 12, 2003

💥💥🤯💥💥
Gotta nuke something
I wouldn't worry too much about Mantle; it's not going to go anywhere and will basically end up as another PhysX. There'll be a few developers that produce AAA games that take advantage of Mantle (like Batman and PhysX), but it's not going to be accepted very widely. The last thing developers and publishers want to do is fracture their potential audience, and that's exactly what would happen if you started producing games where people in one video-card camp are getting a vastly superior game.

That is to say, if Mantle ends up as a purely AMD-only option. If it ends up being usable by both AMD and Nvidia, I could see wider adoption, but honestly, even in that case I don't see it becoming "the next big thing".

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

dog nougat posted:

drat, the 290 for $400 is really enticing. Kind of a shame about the stock cooler being a bit obnoxious. I'm very seriously considering dropping the $$ on one once the custom cooled versions become available, but I wish EVGA made AMD hardware. I was looking at the Sapphire version of the 290; are they a recommended AMD card manufacturer? I'm also really tempted to wait and see what Maxwell offers, just so I could snag an EVGA card. I recently upgraded to 2560x1440 and my 570 is barely cutting it.

AMD has quite a few different companies making their cards - Sapphire seems to be a big one, but Gigabyte, MSI, PowerColor, Asus and a few others all have variants of AMD cards. I have a Sapphire HD 6970 with the dual fan/heatpipe cooler, and it's actually a LOT quieter than the reference design, and cooler to boot. I haven't seen it get above the low 70s when I game, and I usually keep the fan speed around 60-65% tops, around 40% for everyday use.

GrizzlyCow
May 30, 2011

Dilbert As gently caress posted:

Any good reviews of the AMD 240/250 cards? I've got an APU right now that I work off of; it's okay, but drat does it suck with KSP/DX:HR/CS:GO at 1920x1080.

Hoping to pick this up Friday: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127764

Hardware Heaven is the only site to have reviewed one of those cards. Here. The R7 250 should be around a Radeon HD 7730 in performance (between the GT 640 and Radeon HD 7750), so it may be adequate for what you want. I'd get a 7750 instead.

I'll try to find a nice analogue for the R7 240. I don't think it has had any reviews.

SlayVus
Jul 10, 2009
Grimey Drawer
Anyone have a preferred program, preferably free, that can edit ShadowPlay videos? I tried the default Windows Movie Maker, but it can't read them. They play just fine in VLC with the K-Lite codec pack.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug

GrizzlyCow posted:

Hardware Heaven is the only site to have reviewed one of those cards. Here. The R7 250 should be around a Radeon HD 7730 in performance (between the GT 640 and Radeon HD 7750), so it may be adequate for what you want. I'd get a 7750 instead.

I'll try to find a nice analogue for the R7 240. I don't think it has had any reviews.

Really makes me wonder how much of that is driver-related vs. how well developed the drivers are for a given card.

Ahh, the 250 doesn't require a 6-pin...

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

Teenage Fansub posted:

I'm bootcamping Windows 8 on an iMac, which has an AMD mobility GPU.
Just wondering if it's possible and safe to run a regular non-mobility beta driver on an 'M' card.

In the past, the mobility drivers worked for the AMD chips in the iMacs. From what I remember, the desktop versions won't even install. Don't stick with Apple's BootCamp drivers though because they tend to be very old.

Arzachel
May 12, 2012

Bloody Hedgehog posted:

I wouldn't worry too much about Mantle; it's not going to go anywhere and will basically end up as another PhysX. There'll be a few developers that produce AAA games that take advantage of Mantle (like Batman and PhysX), but it's not going to be accepted very widely. The last thing developers and publishers want to do is fracture their potential audience, and that's exactly what would happen if you started producing games where people in one video-card camp are getting a vastly superior game.

That is to say, if Mantle ends up as a purely AMD-only option. If it ends up being usable by both AMD and Nvidia, I could see wider adoption, but honestly, even in that case I don't see it becoming "the next big thing".

Mantle isn't going to be used by Nvidia, much like PhysX isn't going to be used by AMD. What AMD are betting the farm on, and what differentiates Mantle from Glide, PhysX, etc., is that the API would be implemented in the bigger engines to be closely compatible with the console paths, so porting a multiplat title to Mantle would have a low opportunity cost, since you've already written the code once. This rarely ends up as straightforward in practice, so we'll have to wait and see, but I'd say Mantle has a far greater chance to stick than GPU-accelerated PhysX ever did.
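
A toy sketch of the multi-backend structure being described: one renderer interface, with a concrete path picked per platform/GPU, so a Mantle port reuses everything above that layer. All class and function names here are hypothetical illustration, not any real engine's API.

code:

# Hypothetical illustration of an engine with swappable render backends.
class Renderer:
    def draw(self, scene):
        raise NotImplementedError

class D3D11Renderer(Renderer):
    def draw(self, scene):
        print("submitting via Direct3D 11")

class MantleRenderer(Renderer):
    def draw(self, scene):
        print("submitting via Mantle (GCN-only path)")

def pick_renderer(gpu_is_gcn: bool, mantle_available: bool) -> Renderer:
    # The claimed low opportunity cost: once this switch exists, a Mantle
    # port is one more branch here, since everything above it is shared.
    if gpu_is_gcn and mantle_available:
        return MantleRenderer()
    return D3D11Renderer()

pick_renderer(gpu_is_gcn=True, mantle_available=True).draw(scene=None)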

Edit: 20nm GPUs are at the very least 9 months off (expect a year), and 20nm planar itself looks somewhat underwhelming, so there's no reason to think that Maxwell/* Islands are going to obsolete your shiny new GPU overnight.

Arzachel fucked around with this message at 08:36 on Nov 6, 2013

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai
Regarding Mantle, I keep hearing rumors that it's apparently supposed to be open (meaning anyone can use it, including NVIDIA), but that's all they are at this point. Rumors.

I'm honestly not expecting it to be open though, at least not initially. Just look at it from a business perspective. With "hot" titles like Battlefield 4 supporting it and touting "drastically" better performance with Mantle, it would be silly for AMD to allow NVIDIA to use it straight away because it would negate any sort of boost in sales they would have. Right now, AMD has made Mantle to only benefit themselves.

I would personally love to see it go completely open at a later date though. Or at least usher in more low level access on APIs like OpenGL.

When it comes to Mantle and G-Sync, I personally don't find them to be buying points just yet for either party. Give them both some time to mature, and we might have something. As for now, buying a GPU based on features we know very little about is just stupid and pointless.

Teenage Fansub
Jan 28, 2006

mayodreams posted:

In the past, the mobility drivers worked for the AMD chips in the iMacs. From what I remember, the desktop versions won't even install. Don't stick with Apple's BootCamp drivers though because they tend to be very old.

I was already using a beta mobility driver because the bootcamp one didn't come with Catalyst and the main mobility drivers disable most Catalyst features.
Problem is I updated to Win8.1 and didn't disable driver updates, so now I have the latest regular mobility driver installed with barebones Catalyst.
The new 64bit mobile beta has some hosed 0kb download, which is why I was wondering if I could just use the main driver.

I remember back in the day with an old XP laptop I did something to fool the system into thinking my mobile GPU was a regular version so I access more features.

The_Franz
Aug 8, 2003

Arzachel posted:

Mantle isn't going to be used by Nvidia, much like PhysX isn't going to be used by AMD. What AMD are betting the farm on, and what differentiates Mantle from Glide, PhysX, etc., is that the API would be implemented in the bigger engines to be closely compatible with the console paths, so porting a multiplat title to Mantle would have a low opportunity cost, since you've already written the code once. This rarely ends up as straightforward in practice, so we'll have to wait and see, but I'd say Mantle has a far greater chance to stick than GPU-accelerated PhysX ever did.

Except the consoles don't use Mantle or anything resembling it, and probably never will. You have DX11 on the Xbone, and the PS4 lets you use OpenGL and whatever Sony's proprietary API is. Right now, you have to write a Mantle render path strictly for AMD PCs.

roadhead
Dec 25, 2001

The_Franz posted:

Except the consoles don't use Mantle or anything resembling it, and probably never will. You have DX11 on the Xbone, and the PS4 lets you use OpenGL and whatever Sony's proprietary API is. Right now, you have to write a Mantle render path strictly for AMD PCs.

Sure, but the shader assembly itself is, I believe, going to be identical on all three. Also, since AMD is surely supplying some of the toolchain for the Xbox One and the PS4, you don't think they have at least some idea at what level this stuff is abstracted?

You don't think that behind DirectX and the PS4's proprietary API it's all Mantle underneath, and devs that want to get as close "to the metal" as they can on all three platforms will use it?

Here's hoping.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

roadhead posted:

Sure, but the shader assembly itself is, I believe, going to be identical on all three. Also, since AMD is surely supplying some of the toolchain for the Xbox One and the PS4, you don't think they have at least some idea at what level this stuff is abstracted?

You don't think that behind DirectX and the PS4's proprietary API it's all Mantle underneath, and devs that want to get as close "to the metal" as they can on all three platforms will use it?

Here's hoping.

I learned a valuable lesson about eating hats but if this turns out to be true I will be as upset as somebody who was forced to eat his own hat because it's absurd as poo poo and, what, the, heck, it does not work that way

GrizzlyCow
May 30, 2011

Rahu X posted:

Regarding Mantle, I keep hearing rumors that it's apparently supposed to be open (meaning anyone can use it, including NVIDIA), but that's all they are at this point. Rumors.

I'm honestly not expecting it to be open though, at least not initially. Just look at it from a business perspective. With "hot" titles like Battlefield 4 supporting it and touting "drastically" better performance with Mantle, it would be silly for AMD to allow NVIDIA to use it straight away because it would negate any sort of boost in sales they would have. Right now, AMD has made Mantle to only benefit themselves.

I would personally love to see it go completely open at a later date though. Or at least usher in more low level access on APIs like OpenGL.

When it comes to Mantle and G-Sync, I personally don't find them to be buying points just yet for either party. Give them both some time to mature, and we might have something. As for now, buying a GPU based on features we know very little about is just stupid and pointless.

Mantle is open, but it is tied to the GCN architecture. NVIDIA would not be able to benefit without altering their cards to be more like AMD's offerings. It's "open" but not really open.

Blorange
Jan 31, 2007

A wizard did it

I was under the impression that Mantle exists not to make the GPU run faster, but to make the CPU more efficient at issuing calls to the graphics architecture. If your graphics card is already maxed out, you're not going to see much of a benefit. On the other hand, if your CPU is capping your FPS, you could render far more discrete objects on screen, or get the boost needed to hit the 120Hz mark. If this is the case, only people with twinned+ high-end cards are really going to see the difference.
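
That framing is easy to sanity-check with a back-of-the-envelope model: the frame rate is set by whichever side finishes last, so cutting CPU-side submission cost only matters when the CPU is the bottleneck. All numbers below are made-up examples, not Mantle measurements.

code:

# Toy model: frame time = max(CPU submission time, GPU render time).
def fps(draw_calls, cpu_us_per_call, gpu_frame_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0
    return 1000.0 / max(cpu_ms, gpu_frame_ms)  # slower side sets the pace

# GPU-bound: a 20 ms GPU frame hides the CPU entirely, so a 4x cheaper
# draw call changes nothing.
print(fps(5000, 2.0, 20.0), fps(5000, 0.5, 20.0))    # 50.0 50.0
# CPU-bound: the same 4x reduction quadruples the frame rate.
print(fps(20000, 2.0, 8.0), fps(20000, 0.5, 8.0))    # 25.0 100.0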

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

GrizzlyCow posted:

Mantle is open, but it is tied to the GCN architecture. NVIDIA would not be able to benefit without altering their cards to be more like AMD's offerings. It's "open" but not really open.

I hear conflicting reports even on this. While I believe it will always be tied to the GCN architecture, a lot of tech journalists like to spin it in a way that paints it as the next OpenGL.

Not that I believe them. I'll believe what is shown during the AMD Developer Summit.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Rahu X posted:

I hear conflicting reports even on this. While I believe it will always be tied to the GCN architecture, a lot of tech journalists like to spin it in a way that paints it as the next OpenGL.

Not that I believe them. I'll believe what is shown during the AMD Developer Summit.

This is the correct attitude to have at the moment. We don't know poo poo about Mantle except that they've stated it will reduce draw call overhead by 9x compared to D3D, which in and of itself isn't outright unreasonable given that it's a known bottleneck for GPU/CPU performance - but read Professor Science's excellent post on the topic, which I will quote here in its entirety because it's short and to the point. What are they planning to do about WDDM?

Professor Science posted:

WDDM is designed for a really specific problem--3D accelerated desktop UIs--and all of its tradeoffs are built around that. This is why you can do things like "games generally don't crash when I alt-tab" and "I can see window previews when I alt-tab" and "badly behaved drivers don't cause BSODs." It's also why you get other things like "command buffer submission to the GPU takes forever" and "compute APIs are always going to be second-class citizens." WDDM was designed for NV40/G70 class hardware ten years ago, and it shows. If you remember back in the proverbial day, there was a proposal for WDDM 2.0 that was spectacularly unrealistic, like "all hardware must support instruction-level preemption" unrealistic (to my knowledge, no GPU supports instruction level preemption). MS finally added support for any sort of preemption in WDDM 1.2 (Win8), but they haven't done anything to address things like buffer queue overhead (not since they fixed something completely horrible in Vista with something less horrible in Win7), GPU page faulting, or shared memory machines.

The thing I'm most curious about with Mantle is how it will work alongside WDDM, because upon reflection and discussion with some similarly knowledgeable folks, none of us can figure out how you could get WDDM interoperability except in one of two ways:

1. a large static memory carve-out at boot and a second Mantle-specific device node, rendering into a D3D surface
2. only run on a platform that has a GPU with reasonable preemption (at least per-wavefront) and an AMD IOMMU

Of course, they could ship Mantle in a separate driver that blatantly circumvents WDDM and that they never attempt to get WHQL'd, but that seems unrealistic.

If you look at the HSA slides from Hot Chips, the driver they propose is definitely a response to the stagnancy of WDDM, but it's also mired in some unrealistic stuff (the idea that you can return from a GPU operation to a suspended user-mode process without entering the kernel is nonsense) and some pointless stuff; a standardized pushbuffer format was tried by MS briefly in the DX5/6 timeframe, I think, and it was a travesty that vendors all rebelled against.

(i know a lot about driver models, i should really write my own sometime)

Proper mode right now is to be a little :raise: and mainly agnostic (you're always allowed to hope for whatever, obviously, especially if you are an AMD card owner, but hope isn't knowledge) toward the whole thing 'til they get the details aired out. Everything stated so far is essentially just PR, and some of it is quite misleading - especially "analysts" pointing out some version of "they've already got Mantle on the consoles! It'll be an easy switch to PC!" when that's just not true based on what we actually know about the console development APIs.

The_Franz
Aug 8, 2003

Blorange posted:

I was under the impression that Mantle exists not to make the GPU run faster, but to make the CPU more efficient in executing calls to the graphics architecture. If your graphics card is already maxed out you're not going to see much of a benefit. On the other hand, if your CPU is capping your FPS, you could render far more discrete objects on the screen or get the boost to hit the 120hz mark. If this is the case, only people with twinned+ high end cards are really going to see the difference.

Their big claim is that it has significantly reduced draw call overhead vs Direct3D. I want to see a benchmark of Mantle vs a modern OpenGL pipeline as I suspect that the differences will be much smaller, if they exist at all.

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!

The_Franz posted:

Except the consoles don't use Mantle or anything resembling it, and probably never will. You have DX11 on the Xbone, and the PS4 lets you use OpenGL and whatever Sony's proprietary API is. Right now, you have to write a Mantle render path strictly for AMD PCs.

Mantle uses DirectX's HLSL language for shaders, so there's that.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Is there a way to set a global framerate cap with Nvidia? I want to run Final Fantasy at 120Hz, but the in-game menu only has framerate caps for 30 and 60 fps, and if I uncap it, my GTX 770 goes right for infinity FPS and revs up like a vacuum cleaner.

PC LOAD LETTER
May 23, 2005
WTF?!

GrizzlyCow posted:

If you want to use Mantle, you'll have to have a GCN 1.x card.
Supposedly Hawaii is GCN 2.0 and Bonaire was GCN 1.x, but none of that "GCN 2.0, 1.x, blabla" is official branding.

I don't think AMD has said officially how long they plan on supporting Mantle, but if history repeats itself a la their VLIW GPUs, they probably have another 2-3 years of GCN iterations in the works at a minimum. Possibly longer, if you consider how the foundries have slowed down releasing new processes, plus how those new processes don't offer the same level of advantage as previous years' shrinks.

AMD and nV will both be forced to put more work into designing their GPUs, since they can no longer rely on a die shrink in 6-8 months giving them another 30%+ performance on the same uarch like in the "old days".

Gwaihir
Dec 8, 2009
Hair Elf

Zero VGS posted:

Is there a way to set a global framerate cap with Nvidia? I want to run Final Fantasy at 120Hz, but the in-game menu only has framerate caps for 30 and 60 fps, and if I uncap it, my GTX 770 goes right for infinity FPS and revs up like a vacuum cleaner.

Not sure about the Nvidia drivers themselves, but video recording programs like DxTory usually have built-in adjustable FPS caps. I know in DxTory you can set it to whatever you want; maybe MSI Afterburner and FRAPS will do the same thing?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

nVidiaInspector and EVGA Precision both have FPS limiters. I'd guess MSI Afterburner does too, just because it's made by the same guy who makes Precision, so why would one have a feature and the other lack it? I haven't tested it in a while, but it used to be great for making games play nice with my monitor's refresh rate with no tearing or input lag. Then for a while it sort of sucked and tended to over-throttle the card, but hopefully that's been fixed. (A minimal sketch of what these limiters do follows this post.)

Aside: anybody know what the newer NV Control Panel option "AA Fix" actually does? :confused:

Edit: For that matter, the recently added VSync Smooth AFR Behavior. I'm going to read up and see what nVidia has been dicking around with lately; these options popped up without any real documentation and I'd like to know what they're for. I'm guessing AFR is mainly for SLI setups, but I don't like guessing unless I have to.

Agreed fucked around with this message at 20:31 on Nov 6, 2013
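
Conceptually, all of these limiters do the same thing: after each frame, sleep off the remaining frame budget so the card idles instead of racing to "infinity FPS". A minimal sketch of the idea (not any of these tools' actual implementation):

code:

# Minimal frame limiter sketch: cap a render loop at ~120 fps.
import time

TARGET_FPS = 120
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~8.33 ms per frame

def render_frame():
    pass  # stand-in for the game's actual rendering work

for _ in range(600):              # ~5 seconds at the target rate
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)  # idle out the leftover budget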

DaNzA
Sep 11, 2001

:D
Grimey Drawer

Agreed posted:

nVidiaInspector and EVGA Precision both have FPS limiters.

Whoa, thanks for that. I'd kinda just given up before and let the game run up to 300 fps for no good reason while pegging the GPU at maximum.

fookolt
Mar 13, 2012

Where there is power
There is resistance
Does Battlefield 4 support ShadowPlay?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Other way around. ShadowPlay supports basically everything.

featurecreep
Jul 23, 2002

Yes, Robinson, take the Major, the Robot, your wife and kids... but leave Will for my plea-- his education.

SlayVus posted:

Anyone have a preferred program, preferably free, that can edit ShadowPlay videos? I tried the default Windows Movie Maker, but it can't read them. They play just fine in VLC with the K-Lite codec pack.

Seconding this question. I tried using Yamb to just change the "Pixel Aspect Ratio" so my videos don't look squashed (16:9 -> 16:10 = bleh), but it isn't cutting it.
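
If the goal is just fixing the squashed look rather than real editing, rewriting the container's display aspect ratio may be enough. A hedged sketch calling ffmpeg (assumed installed) from Python: the -aspect flag sets the display aspect ratio and -c copy passes the streams through without re-encoding. File names are placeholders.

code:

# Rewrite the display aspect ratio of a ShadowPlay recording without
# re-encoding, by calling ffmpeg (assumed to be on PATH).
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "shadowplay_in.mp4",   # placeholder: the squashed source file
    "-aspect", "16:10",          # desired display aspect ratio
    "-c", "copy",                # stream copy: fast and lossless
    "shadowplay_out.mp4",        # placeholder output name
], check=True)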


fookolt
Mar 13, 2012

Where there is power
There is resistance

Factory Factory posted:

Other way around. ShadowPlay supports basically everything.

Hm, I can't seem to get it to show up or work when I'm playing Battlefield 4 :(
