PC LOAD LETTER
May 23, 2005
WTF?!

Ignoarints posted:

Well... that's what they're saying.
Shaocaholica's 'card will ramp clocks and voltage until it hits 95C/max_clock/max_volts' comment is pretty much how I read it.

Shaocaholica posted:

Reads pretty straight to me.
I remember there was a huuuuuuge shitstorm of a thread on TR where everyone thought they meant 'run at 95C all the time, gently caress heat and YOU lawl'.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party
cough I remember the last time somebody thought a loud cooler or high temps was a feature.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

PC LOAD LETTER posted:

OK maybe I found it.

Definitely some marketing BS in there and they worded their responses oddly but I'm still not reading it as them saying 'run as hot as possible all the time'.

<my thoughts on gamers who build their own PCs go here>

I'd like to think their target market (PC gamers who mostly build their own PCs) would shrug and say 'OK, gimme!' but maybe I'm optimistic.

The thing is...

"95C is the optimal temperature that allows the board to convert its power consumption into meaningful performance for the user. Every single component on the board is designed to run at that temperature throughout the lifetime of the product. If you throttle the temperature down below that threshold, then the board must in turn consume less power to respect the new temperature limit. Consuming less power means lowering vcore and engine clock, which means less performance."

... unless I just don't grasp thermodynamics and unless they're using magic transistors, this is a big ol' pile of bullshit. The first sentence and the last sentence in that statement have absolutely nothing to do with each other, and the ones in the middle are spinning faster than a stock 290X cooler in Uber mode (:haw:). Oh no! It must consume less power because it is cooler! Except V=IR holds even with their transistors, and raising the temperature raises the resistance, which raises the voltage required to operate at a given frequency - which means this is marketing, not truth.
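To put rough numbers on the leakage point (a minimal sketch; every constant below is made up for illustration and is not measured 290X data):

```python
# Toy model: dynamic power ~ C * V^2 * f, plus leakage that grows with die
# temperature. All constants are invented for illustration only.

def gpu_power(vcore, freq_mhz, temp_c,
              c_eff=0.18,        # hypothetical effective switching term
              leak_base=25.0,    # hypothetical leakage (W) at 25 C, 1.0 V
              leak_per_c=0.012): # hypothetical leakage growth per degree C
    dynamic = c_eff * vcore**2 * freq_mhz                        # switching power (W)
    leakage = leak_base * vcore * (1 + leak_per_c * (temp_c - 25))
    return dynamic + leakage

# Same voltage and clock, two die temperatures:
print(f"70C: {gpu_power(1.15, 1000, 70):.0f} W")
print(f"95C: {gpu_power(1.15, 1000, 95):.0f} W")  # the hotter die burns more watts
```

At a fixed voltage and clock, the hotter die simply burns more watts - the opposite of "it must consume less power because it is cooler."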

Which he then admits - because, jesus, he sure talked himself into a corner there - in the followup question:

"A better cooler increases the watts of heat that the product can emit before the 95C equilibrium temperature is reached. In turn, this raises the maximum permissible clockspeed (within the limit of product TDP) the board can sustain."

Oh, so wait - if it runs cooler, it will in fact reach the highest permissible clockspeed within TDP. That's neat, because that makes total sense. It does not, however, follow from your prior statement, like, at all.
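For what it's worth, that second statement is just the thermal-resistance relation T_die ≈ T_ambient + P × θ, where θ (°C per watt) measures the cooler. A quick sketch with invented θ values:

```python
# T_die = T_ambient + P * theta  =>  the watts a cooler can move before the
# die reaches 95C. Theta values here are made up for illustration.

def max_watts(target_c=95.0, ambient_c=30.0, theta_c_per_w=0.30):
    return (target_c - ambient_c) / theta_c_per_w

print(f"reference-ish cooler (0.30 C/W): {max_watts(theta_c_per_w=0.30):.0f} W")
print(f"better cooler        (0.20 C/W): {max_watts(theta_c_per_w=0.20):.0f} W")
```

Lower θ, more watts before 95°C, higher sustainable clock within TDP. True - but it says nothing about 95°C being "optimal."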

Gotta stick with my interpretation here: that was some ol' bullshit and they knew it, but who in the world can say it, y'know? I think gamers who build their PCs are the most likely of anyone to want a quieter computer - see the lengths that [H] folks go to. For that matter, he acknowledges immediately after that sentence, in the same response, that people are flocking to aftermarket coolers that - forgive me if this sounds crass, I don't mean any offense - prove the bullshit he's putting out to be bullshit, since they're quieter, cooler, and more desirable. Or, in his words,

"Users with the Accelero coolers are finding they're reaching the clockspeed limit of the product at a lower temperature limit than 95C. So you can see how the experience is very customizable and interesting for enthusiasts."


I don't see any other way to read that. They knew their cooling solution was kinda balls - it works, but it's louder than people like, hotter than people like, and doesn't allow for the kind of overclocking that people like and which vendors specifically facilitate. And the "optimal temperature" of 95ºC is more like the maximum temperature the card will run at (safely, sure, but throttling), and the card heads straight for it because that's what PowerTune tells it to do - a standoff between the cooler's marginal capability, the heat output of the GPU and other components, and clockspeed. And when it comes down to it, clockspeed blinks, while everything else stays at 95ºC. That's the reference cooler in a nutshell, on my read, and the basis for my "hot as possible" interpretation and the mega-:raise: I can't help but throw their way for positing that as somehow a good thing, especially when the competitor's product outmatches it by a great deal and uses an objectively superior cooler.
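Reading it that way, the behavior is a simple closed-loop throttle. Here's a toy simulation of that standoff - this is NOT AMD's actual PowerTune algorithm, just the control behavior their statements describe, with invented constants:

```python
# Clockspeed "blinks" first: throttle when over the 95C target, ramp back
# up when under it. Power and thermal models are toys, not real 290X data.

TARGET_C, AMBIENT_C = 95.0, 30.0
MAX_CLOCK, MIN_CLOCK = 1000, 300   # MHz, hypothetical limits

def settle(theta_c_per_w, watts_per_mhz=0.3, steps=500):
    clock = MAX_CLOCK
    for _ in range(steps):
        temp = AMBIENT_C + clock * watts_per_mhz * theta_c_per_w
        if temp > TARGET_C and clock > MIN_CLOCK:
            clock -= 5             # too hot: clockspeed gives way
        elif temp < TARGET_C and clock < MAX_CLOCK:
            clock += 5             # headroom: ramp back up
    return clock, AMBIENT_C + clock * watts_per_mhz * theta_c_per_w

for name, theta in [("reference cooler", 0.30), ("aftermarket cooler", 0.18)]:
    clock, temp = settle(theta)
    print(f"{name}: ~{clock} MHz at ~{temp:.0f} C")
```

With the weaker cooler the loop parks at 95°C and whatever clock that allows; with the stronger one it pins the clock limit below 95°C - which is exactly the Accelero behavior he describes.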

Granted the latter has a lot of other things going on, too, like lower transistor density and generally better efficiency in terms of clock per watt (and I know that clock per watt does not equal performance per watt, but it's all in the TDP equation, y'know?).

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

HalloKitty posted:

The MSI 280X with the two giant fans is very quiet, but those fans leak lubricant and seize up - and that's actually a cooler MSI uses on NVIDIA cards too. It's stupid: it started leaking crap from day one, and was noticeably stiffer within a couple of months. RMA'd and sold it in the end and went back to my unlocked 6950, once I realised I didn't care about more sparkly settings - I just needed to save money.

The unlockable 6950 2GB has been a fantastic card for me, and represents a well-timed purchase (much like the time I bought a stack of six 2TB drives a month before the Thai floods). Oh, and the 6950 I have is also an MSI Twin Frozr, but with different fans, and they're as smooth as the day the card was put in.

Well that's terrible news for me since I just picked up that 280X. Might go ahead and put the Accelero on it ahead of time, since I'd rather do it before it's been running for a while.

Definitely going to hold onto my unlocked 6950 as well just in case since it's been extremely reliable so far. That was a well-timed purchase that made up for the 850pro I had that was laser-cut and wouldn't unlock. ATI was nice enough to replace the card for me after the core got mysteriously cracked at one point at least.

I picked up six 1TB drives right before the floods too. Definitely got those at the right time.

future ghost fucked around with this message at 05:32 on Jun 7, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
Does this MSI leaking fans thing affect all MSI cards or just AMD ones? Should I be worried about my 780ti?

beejay
Apr 7, 2002

This thread seems to indicate that it's cards that were used for mining and the like, with the fans running at full speed all the time. The problem has since been fixed, so only cards from 2013 should be affected.

People who weren't mining would probably have run into the same problem eventually too, but people running the cards flat out hit it first - which I guess is good, because it got the problem fixed early.

BurritoJustice
Oct 9, 2012

The Lord Bude posted:

Does this MSI leaking fans thing affect all MSI cards or just AMD ones? Should I be worried about my 780ti?

Any MSI card manufactured after November 2013 should not have the issue. The issue only occurred on earlier cards if the fan was kept at 100% for very extended periods of time. MSI also offers a free replacement of your card if it's affected. So nah, don't worry about it in the slightest.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

cisco privilege posted:

Well that's terrible news for me since I just picked up that 280X. Might go ahead and put the Accelero on it ahead of time, since I'd rather do it before it's been running for a while.

The cooler was great while it worked, and I got an RMA without hassle.

Only one fan went. Who knows why. Mine was bought at the end of last year, so it was in the dodgy batch of fans.

The MSI cards otherwise seem great, so I wouldn't worry unduly.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Yeah, as far as I've heard - for what it's worth, heh, but I do look for this kinda crap, so take it as you will - it seems like MSI is a very good company to deal with when their cards have an issue that is most definitely their fault. Maybe not as "we'll probably replace your card even if you broke every VRM and burnt your core so badly that it melted through the PCB, so long as you can fit the reference cooler back onto it before you ship it to us" crazy as EVGA, but good, from what I've read.

Shaocaholica
Oct 29, 2002

Fig. 5E
Guh, I just want to give Galaxy my monies for a 750Ti low profile, but the drat card is only available from their webstore, which is PayPal-based and currently down because ???. My world for a low profile 750 Ti.

Wistful of Dollars
Aug 25, 2009

Speaking of Galaxy, I'm digging this.

Shaocaholica
Oct 29, 2002

Fig. 5E

Is something like that cheaper than getting the block separate?

Wistful of Dollars
Aug 25, 2009

Shaocaholica posted:

Is something like that cheaper than getting the block separate?

Dunno, have to see what the prices are like when it goes to market. Looking at EVGA's Hydro Copper, it probably won't be any cheaper, tbh. I could have linked the air-cooled version; I'm just a fan of the white PCBs. Also, they seem to be pretty good overclockers (it has a massive PCB like the Classified).

Shaocaholica
Oct 29, 2002

Fig. 5E
So if I replace the TIM and tape on a 290X while keeping the stock cooler (for now), what TIM/tape should I use? I already have NT-H1, which should be fine for the core, but what about tape/pads? I have Fujipoly but it might be too thick (1.5mm). Is there something better I should be using? Other thicknesses I should buy?

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Shaocaholica posted:

So if I replace the TIM and tape on a 290X while keeping the stock cooler (for now), what TIM/tape should I use? I already have NT-H1, which should be fine for the core, but what about tape/pads? I have Fujipoly but it might be too thick (1.5mm). Is there something better I should be using? Other thicknesses I should buy?

For what it's worth, I tested the stock pads on VRAM and had no issues. A lot of people doing the bracket mod use the stock pads on the memory without issues. I replaced the VRM pad with fujipoly extreme.

Shaocaholica
Oct 29, 2002

Fig. 5E

deimos posted:

For what it's worth, I tested the stock pads on VRAM and had no issues. A lot of people doing the bracket mod use the stock pads on the memory without issues. I replaced the VRM pad with fujipoly extreme.

I guess pads are for the most part reusable if they're not soiled or damaged?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
gently caress, this is old news but I somehow missed it: support for Coverage Sample Anti-Aliasing (CSAA) has been removed on Maxwell (GTX 750) and later cards, and will be disabled on earlier cards if paired with a Maxwell-based GPU. This is awful; CSAA was my favorite AA mode because it provided very effective AA with high sharpness and low performance overhead. 16X CSAA looked better than 8X MSAA with the performance of 4X MSAA. Coverage samples were also important for TXAA, as they allowed for better AA with a reduced blur effect (not that I know of any games that actually implemented this).

I think the key reason to remove CSAA (aside from any benefits of simplifying the ROPs) is that most aliasing today isn't on geometry but on textures, transparencies, or shader output. As a result, developers are focusing on shader-based AA, with ROP-based AA mostly used as a pre-filter to improve effectiveness. For example, 2X MSAA isn't effective enough to be visually pleasing, and FXAA removes a lot of detail if turned up high enough to remove aliasing on its own, but if you do 2X MSAA first you can use much less aggressive FXAA settings that don't visibly affect texture detail but provide better AA.
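The "performance of 4X MSAA" part falls out of the framebuffer math: 16X CSAA stores only four full color/Z samples plus sixteen cheap coverage samples. A rough per-pixel cost comparison (RGBA8 color, 32-bit depth/stencil; the bits-per-coverage-sample figure is an assumption, since the hardware layout isn't public):

```python
# Approximate bytes per pixel for MSAA vs CSAA sample storage.

def msaa_bytes(color_samples, bpp_color=4, bpp_depth=4):
    return color_samples * (bpp_color + bpp_depth)

def csaa_bytes(color_samples, coverage_samples, coverage_bits=4):
    # coverage_bits per extra sample is a guess at the on-chip encoding
    return msaa_bytes(color_samples) + coverage_samples * coverage_bits // 8

print("4X MSAA :", msaa_bytes(4), "bytes/pixel")
print("16X CSAA:", csaa_bytes(4, 16), "bytes/pixel")  # 4 color + 16 coverage
print("8X MSAA :", msaa_bytes(8), "bytes/pixel")
```

Sixteen coverage samples for roughly the storage and bandwidth of 4X MSAA.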

An obsessive part of me is seriously considering picking up a GTX 780 Ti 6GB if they ever come down in price and holding out for the long haul, or just giving up and going AMD. CSAA just looks so incredibly good on DX9 titles like Source games, that's practically what it's made for :(

Shaocaholica
Oct 29, 2002

Fig. 5E
Why remove it? Could it have been that much of a hardware cost? Are all current forms of AA hardware-based?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
How does it compare to SMAA? That's definitely my favourite for that combination of looks/performance. FXAA is blurry garbage by comparison.

Ignoarints
Nov 26, 2010

Alereon posted:

gently caress, this is old news but I somehow missed it: support for Coverage Sample Anti-Aliasing (CSAA) has been removed on Maxwell (GTX 750) and later cards, and will be disabled on earlier cards if paired with a Maxwell-based GPU. This is awful; CSAA was my favorite AA mode because it provided very effective AA with high sharpness and low performance overhead. 16X CSAA looked better than 8X MSAA with the performance of 4X MSAA. Coverage samples were also important for TXAA, as they allowed for better AA with a reduced blur effect (not that I know of any games that actually implemented this).

I think the key reason to remove CSAA (aside from any benefits of simplifying the ROPs) is that most aliasing today isn't on geometry but on textures, transparencies, or shader output. As a result, developers are focusing on shader-based AA, with ROP-based AA mostly used as a pre-filter to improve effectiveness. For example, 2X MSAA isn't effective enough to be visually pleasing, and FXAA removes a lot of detail if turned up high enough to remove aliasing on its own, but if you do 2X MSAA first you can use much less aggressive FXAA settings that don't visibly affect texture detail but provide better AA.

An obsessive part of me is seriously considering picking up a GTX 780 Ti 6GB if they ever come down in price and holding out for the long haul, or just giving up and going AMD. CSAA just looks so incredibly good on DX9 titles like Source games, that's practically what it's made for :(

Out of curiosity, is the 6GB 780 Ti expected to be better than the 3GB? I had high hopes for the 780, but it seems like it runs into the same issue as all the other nvidia cards with double the RAM. At least, up until a month ago.

fromoutofnowhere
Mar 19, 2004

Enjoy it while you can.
Anyone know where I can find some information on an EVGA GTX 750 Ti Superclocked 2GB w/ACX? I'm trying to figure out how high I can OC it without going through the trial and error, but I can't find any information online from anyone who owns one.

Edit: It's an EVGA. It's weird; it's not even on EVGA's website. They list a FTW version, but mine is not listed at all.

fromoutofnowhere fucked around with this message at 21:43 on Jun 7, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E
Well I just won a miner 290x for $300 shipped. Yay me!

Josh Lyman
May 24, 2009


Shaocaholica posted:

Well I just won a miner 290x for $300 shipped. Yay me!
Link to auction?

Shaocaholica
Oct 29, 2002

Fig. 5E

Josh Lyman posted:

Link to auction?

Ok ok $305 shipped.

http://www.ebay.com/itm/151319724047

BurritoJustice
Oct 9, 2012

El Scotch posted:

Dunno, have to see what the prices are like when it goes to market. Looking at EVGA's Hydro Copper, it probably won't be any cheaper, tbh. I could have linked the air-cooled version; I'm just a fan of the white PCBs. Also, they seem to be pretty good overclockers (it has a massive PCB like the Classified).

The hydrocopper cards also come with lovely blocks, whereas this Galaxy card comes with an EK block. EK makes the best blocks in the business.

Edit: Well, I've just read that EVGA is using EK blocks for their Hydro Copper cards from the Titan Z onwards, which will no doubt be an improvement over the old Swiftech blocks that were renowned for being very restrictive and bad for VRM cooling.

BurritoJustice fucked around with this message at 23:37 on Jun 7, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

How does it compare to SMAA? That's definitely my favourite for that combination of looks/performance. FXAA is blurry garbage by comparison.

SMAA is primarily a shader-based AA technique. Apples to apples, SMAA is way, way better but has more of a performance hit when turned up - because it CAN be turned up quite a lot more than FXAA can. They more or less stopped development on FXAA when Timothy Lottes started working on TXAA instead. Which really sucks, because the iteration of FXAA he was working on at the time of that switch was near completion and looked really fantastic, while still coming in at a performance hit below 4X MSAA, which TXAA can't claim. Also, it didn't - as I understand it - need to be specifically coded into a game's MSAA implementation in order to function. TXAA basically hitches a ride on MSAA to do what it does. Some people don't even like what it does, but I'm quite fond of it - very cinematic; of course, that was Tim Lottes' goal with the discarded final iteration of FXAA as well. You may be able to find some screenshots from RAGE (which was basically a big ol' nVidia playground, and a lot of tech testing went on with that game) from before he scrubbed his blog when he moved from nVidia to Epic's rendering team.
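Since FXAA's blur keeps coming up: at heart it's a luma-contrast edge test followed by a neighborhood blend, which is also why cranking it smears texture detail. A heavily simplified sketch - real FXAA does directional edge searches and sub-pixel handling, and none of this is Lottes' actual code:

```python
import numpy as np

def toy_fxaa(rgb, contrast_threshold=0.1, blend=0.5):
    # Luma-based edge detect: where the local min/max luma range is high,
    # blend the pixel toward the average of its 4 neighbours.
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    pad = np.pad(luma, 1, mode="edge")
    neighbours = [pad[:-2, 1:-1], pad[2:, 1:-1], pad[1:-1, :-2], pad[1:-1, 2:]]
    edge = (np.max(neighbours, axis=0) - np.min(neighbours, axis=0)) > contrast_threshold
    rgbpad = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = (rgbpad[:-2, 1:-1] + rgbpad[2:, 1:-1] +
               rgbpad[1:-1, :-2] + rgbpad[1:-1, 2:]) / 4
    out = rgb.copy()
    out[edge] = (1 - blend) * rgb[edge] + blend * blurred[edge]
    return out

img = np.zeros((8, 8, 3)); img[:, 4:] = 1.0   # hard vertical edge
print(toy_fxaa(img)[0, 3:6, 0])               # edge pixels get softened
```

Turn `blend` up and every high-contrast texel gets the same treatment as the jaggies - the detail loss Alereon mentioned. Doing 2X MSAA first is what lets you leave `blend` low.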

I don't really see how they can get rid of coverage sampling entirely; that doesn't make sense. If I had to guess what's going on behind the tight-lipped release, it's removing the color/Z/stencil sampling aspect. Forward, deferred, deferred+, tiled, whatever - they can all still make great use of MSAA techniques, they just have to be implemented specifically, and MSAA relies on coverage sampling too. If anything I feel like "CSAA" is misnamed as "Coverage Sampling AA," since the cool, different thing it does is focus on the aforementioned color/Z/stencil samples as a means to more efficiently find and eliminate jaggies. It doesn't play very nicely with the megashader technologies in use today, though, and can break lighting entirely, so I'm not altogether surprised to see it go. I actually never used it myself, but I do agree that it has a certain crisp quality on DX9 games... But we're not in that era anymore, which, overall, is really a good thing.

Shaders are the poo poo going forward and that's just kinda how it is, yanno?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Ignoarints posted:

Out of curiosity, is the 6GB 780 Ti expected to be better than the 3GB? I had high hopes for the 780, but it seems like it runs into the same issue as all the other nvidia cards with double the RAM. At least, up until a month ago.
For me it's almost purely a usable-lifespan thing. I think that if I buy a GTX 780 Ti 3GB, I'll retire the card because 3GB isn't enough VRAM for the settings I want to play at - long before any other aspect of the card or GPU stops being enough, or driver support ends. Baked into this is my belief that VRAM requirements are back to scaling at their normal rate, now that the long reprieve we got from last-gen consoles and their fixed <512MB VRAM is done. While no one buys a GTX 780 Ti for value, spending that much on a card that I'll only be able to use as long as 3GB of VRAM is enough just seems like throwing money away to me.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

For me it's almost purely a usable-lifespan thing. I think that if I buy a GTX 780 Ti 3GB, I'll retire the card because 3GB isn't enough VRAM for the settings I want to play at - long before any other aspect of the card or GPU stops being enough, or driver support ends. Baked into this is my belief that VRAM requirements are back to scaling at their normal rate, now that the long reprieve we got from last-gen consoles and their fixed <512MB VRAM is done. While no one buys a GTX 780 Ti for value, spending that much on a card that I'll only be able to use as long as 3GB of VRAM is enough just seems like throwing money away to me.

Really just depends on your definition of "usable lifespan." In your specific case, since you're apparently wanting to continue to use a specific type of AA that really pleases you (and that's as legitimate as any other reason to buy a luxury product like this, so don't take it badly, haha), I can understand wanting to go with a higher VRAM version of the really good card we have now, but I still think that by the time games start to both

1. be optimized well for

and

2. contingently, thus take genuine advantage of

more VRAM than comes on a 780Ti, the GPU is going to be bottlenecking you more than the VRAM amount. I fully agree that it's awesome that our stupidly long adherence to console-first thinking for the very limited last generation is over. I have faith that we'll see some really cool stuff coming down the compute pipelines (get it) used alongside resources that work better resident in memory than not. But I don't think the 780Ti is your answer. I feel like ultimately you're going to have to compromise on the AA preference if you want to play at the settings you're talking about. I also think you shouldn't look at that as a bad thing: while CSAA is (was?) a neat idea, it's not the end of the world to lose it, and it has some real problems thanks to its reliance on color/Z/stencil sampling in an age where that's not gonna get you nearly as far. There shouldn't be very many, if any, more DX9 games made, from what I can tell. They'll either be really far below your system's capabilities (the huge surge in indie titles that don't focus on graphics intensity and use engines like Unity, etc., which can be optimized and have their own quite capable in-engine AA implementations for the more ambitious devs)... or exceed them, in which case you've got a whole bunch of VRAM but a GPU that can't keep up.

It's an old song, of course, but the AA preference thing, that I understand. Bummer, man. I wish Tim Lottes had finished the next/final iteration of FXAA, because it was gonna be awesome; instead we're left with something that's now pretty crap compared to virtually everything else (even if it is also insanely cheap in terms of render time), despite it being a very nice riposte to AMD's MLAA when it came out.

Continued development is pretty nice for a given AA method. :v: Have you tried modern SMAA injectors? They're quite efficient and highly configurable - worth a look if you're used to a certain crispness, since they can offer results that feel similar to CSAA for the end user, and they're much more agnostic about which engines and rendering methods they'll work with.

Shaocaholica
Oct 29, 2002

Fig. 5E
You can't be 'bottlenecked' by vram amount. You either have enough or you swap. Running out of vram is a lot worse than having slightly less gpu power.
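The asymmetry is easy to see on paper: once you spill, the working set is served over PCIe instead of GDDR5. Ballpark peak figures below (commonly quoted numbers, treat them as rough):

```python
# Why running out of VRAM is worse than a slightly slower GPU: spilled
# data moves at PCIe speed, not VRAM speed. Peak figures; real throughput
# is lower still.

VRAM_GBPS  = 320.0   # reference 290X GDDR5, roughly
PCIE3_GBPS = 15.75   # PCIe 3.0 x16, one direction, theoretical

print(f"Spilled assets arrive ~{VRAM_GBPS / PCIE3_GBPS:.0f}x slower than from VRAM.")
```

A GPU a tier down costs you tens of percent; a texture fetch that misses VRAM costs roughly a factor of twenty.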

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

If you say so. Doesn't match my experience. And it's a bold thing to say that 780Ti owners will have "slightly less GPU power" than owners of cards made with, say, 4GB or 6GB or whatever in the future.

Unlucky7
Jul 11, 2006

Fallen Rib
I currently have a single GTX 560 Ti which does the job well for most games. However, I think it could be a bit more powerful. Will that overclock guide in the OP be good for it? Or should I start looking into whatever the current best value card is?

EDIT: A google search says no. Welp. In that case what card should I look into upgrading to?

Unlucky7 fucked around with this message at 04:08 on Jun 8, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Overclocking it is still pretty simple, for what it's worth. Simpler, really. Overclocking my 650Ti is the same, and frankly kind of nice, since I can directly control things instead of bargaining with an algorithmic approach that treats many instructions as suggestions. It's a 650Ti non-boost, so with regard to overclocking it behaves like every card back to at least the Tesla generation, same as yours.

Three directly controlled variables matter: core frequency, memory frequency, voltage. Set a custom fan profile to keep it cool and see how high you can get each one. Also, unfortunately, your card is the one most impacted by the change from the 316 drivers to the next branch; the 560Ti had problems from launch - some games just do not like its logic for some reason (a huge issue at the time, as I recall, was Battlefield 3 showing quite large black polygons everywhere, and I'm not sure they ever properly worked that out) - and moving to the newer drivers (pretty much necessary for newer game support, unfortunately) basically breaks the loving thing in many situations.
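On the fan profile point: a profile is nothing more than a temperature-to-fan-speed curve that tools like Afterburner let you draw. A sketch of the idea (the curve points here are invented, not a recommendation for any particular card):

```python
# Piecewise-linear fan curve: (temp C, fan %) points, interpolated.

FAN_CURVE = [(30, 30), (50, 40), (65, 60), (75, 80), (85, 100)]

def fan_percent(temp_c, curve=FAN_CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

for t in (40, 60, 70, 80):
    print(f"{t} C -> {fan_percent(t):.0f}% fan")
```

Steeper points up top keep the core cool while you probe how far each of the three variables will go.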

You might want to look into getting an affordable 650Ti Boost if you want a card that's slightly more powerful - usually quite cheap since it's not super current gen... Or you could move up to the next-gen tech and outdo it in performance with a 750Ti, and get a taste of Maxwell before the rest of us :) If you did that, you could expect improvements on the order of these, since the 650Ti is fairly analogous to the 560Ti in overall performance, with better shader performance, generally speaking.



That said, the current best value card is for the bold: used R9 280Xes, 290s, or 290Xs from disassembled mining rigs. See the last few pages and you'll see a trend of them getting unloaded onto eBay by the bucket. The 280X is very much a 7970GHz with a few adjustments here and there, which would be a rather large leap in performance; the 290/290X are AMD's current top-of-the-line, next-gen offerings (though I won't be surprised to see a refresh to compete with high-performance Maxwell cards when they enter the market). There may be some price pressure from all this on nVidia's lineup, at least when it comes to second-hand cards - new, the best bang for the buck is probably still the GTX 760, but the 280X kicks its rear end all over the place if you can find one in a similar price range, thanks to the bitcoin/altcoin folks experiencing the greatest tragedy of a generation or something, idk.

Agreed fucked around with this message at 04:21 on Jun 8, 2014

beejay
Apr 7, 2002

The 760 definitely has some competition for "bang for the buck"-ness with the R9 270X being found for under $200 now and the R9 280 being around $250. All of them are really solid for 1080p gamers at the moment, it's nice.

BurritoJustice
Oct 9, 2012

I don't see the 280 as disruptive to the 760, really - more that it's an alternative if people prefer AMD to Nvidia. The 280 is a 7950, and it trades blow for blow with the 760 depending on the game, and the 270X is outright slower.

beejay
Apr 7, 2002

So... you agree with me? My point is that there is an alternative to the 760 at the same price point and there is also a capable alternative for quite a bit cheaper if you want to go that way.

BurritoJustice
Oct 9, 2012

Oh yeah totally. I misread your initial post as "there is a better price/performance card than the 760 in the form of the 280".

beejay
Apr 7, 2002

Nah. I think I'd recommend the 760 first to novice system builders, just because you can pretty much grab whatever 760 and be totally fine. The AMD cards, I dunno - I look at some of them and think, I don't really know much about that company and can't really recommend them. I've always been fine with Sapphire, but they scored fairly poorly on that French site's stats last year. AMD needs something like nvidia's Greenlight program. That being said, I definitely think AMD is worth consideration, and competition is always a good thing.

Panty Saluter
Jan 17, 2004

Making learning fun!
So this just started happening. The colorful thing in the distance flickers and jumps around when in motion. So far I've only seen it in Skyrim, but that's all I've been playing lately. Twice now I've seen the corruption, quit to desktop, and had the computer freeze. The GPU is a Sapphire HD 7850 (not overclocked now, though it had been before), which had an alarmingly high failure rate anyway. Looks like I got more time out of it than a lot of people, but this sort of artifact usually means the GPU is on its way out, ja? :sigh:



e: gently caress you radium, I don't care if the attachment even shows after this


Panty Saluter fucked around with this message at 15:08 on Jun 8, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E

Agreed posted:

If you say so. Doesn't match my experience. And it's a bold thing to say that 780Ti owners will have "slightly less GPU power" than owners of cards made with, say, 4GB or 6GB or whatever in the future.

Well, I'll give you that there are certain very implausible situations where the GPU simply won't be able to process a poo poo ton of data in vram, given GPU throughput and memory bandwidth. For instance, if you have a 4GB card and whatever scene you're rendering happens to need all 4GB of data to render a single frame. Current top-end cards typically can't process the entirety of their vram within a reasonable allotted time (say 1/120s or 1/60s), and that's being generous. However, I doubt any game developer would design a game that needed to pull that much data per frame, and you'd hit bandwidth barriers on the new consoles before you hit them on desktop GPUs. At the end of the day, if a developer uses 4GB+ of vram on a console and can still hit decent framerates given the limitations, you're likely going to need as much if not more vram on the desktop, unless you want developers to scale back the art for desktop. Just like CPU memory: you either have it or you don't. There's no compensating for a lack of vram or main memory with more memory bandwidth and/or a faster CPU/GPU.
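The "reasonable allotted time" bit is quick arithmetic (using the commonly quoted ~320 GB/s for a reference 290X; ballpark only, and real workloads re-read data many times rather than touching each byte once):

```python
# GB/s needed just to read every byte of VRAM once per frame.

VRAM_GB, BANDWIDTH_GBPS = 4, 320

for fps in (60, 120):
    needed = VRAM_GB * fps
    verdict = "fits on paper" if needed <= BANDWIDTH_GBPS else "exceeds peak bandwidth"
    print(f"{fps} fps: needs {needed} GB/s vs {BANDWIDTH_GBPS} GB/s peak -> {verdict}")
```

At 60 fps a single pass over 4GB just barely fits the theoretical peak, and at 120 fps it doesn't - before any actual rendering work happens at all.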

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Fair points, all. My main concern is what assets are resident in GPU memory, and why. If it's just poor optimization, that's lovely and we're not getting anything nice out of that bargain; if it's because they've got something very clever going on, cool - time to start looking at higher-VRAM cards.
