deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
Oh hey, just in case you guys wanted anecdotal evidence as to what happens when you short a video card's VRM (technically the capacitor bank feeding the VRM):

1. Nothing
2. You fry your motherboard's VRMs.
3. You fry the power traces for the PCIe slot your video card was sitting on.

ASK ME HOW I KNOW.
I shorted my video card's VRM not ONCE, NOT TWICE, BUT THREE TIMES before I realized I machined a piece of copper wrong.

And my VRM1 temps are still retarded high :eng99:

deimos fucked around with this message at 01:13 on May 19, 2014

luigionlsd
Jan 9, 2006

i dont know what this is i think its some kind of nazi giraffe or nazi mountains or something i dont know
Not sure if this belongs in the part picking thread, but what's the word on reference vs. third-party coolers for the GeForce 780? Only considering EVGA, and the price difference is about $20 max between choices.

As classy as the reference design is, I have a feeling the ACX is going to be more efficient.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

deimos posted:

Oh hey, just in case you guys wanted anecdotal evidence as to what happens when you short a video card's VRM (technically the capacitor bank feeding the VRM):

What exactly were you trying to accomplish by doing that?

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

deimos posted:

Oh hey, just in case you guys wanted anecdotal evidence as to what happens when you short a video card's VRM (technically the capacitor bank feeding the VRM):

1. Nothing
2. You fry your motherboard's VRMs.
3. You fry the power traces for the PCIe slot your video card was sitting on.

ASK ME HOW I KNOW.
I shorted my video card's VRM not ONCE, NOT TWICE, BUT THREE TIMES before I realized I machined a piece of copper wrong.

And my VRM1 temps are still retarded high :eng99:

And look at me over here, afraid to even overvolt my GTX 770 by the piddling amount nVidia allows in its software.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

DrDork posted:

What exactly were you trying to accomplish by doing that?

Check out the spiffy spoiler. I am trying to lower my VRM1 temps and machined some copper wrong (it was a millimeter too wide), and I didn't realize it was making contact until the third time I took it apart (I thought it was issues with the thermal padding getting pierced by a resistor).

Next plan: weld a heat pipe to a copper bar to put across those motherfuckers.

Beautiful Ninja posted:

And look at me over here, afraid to even overvolt my GTX 770 by the piddling amount nVidia allows in its software.

Eh, I knew I was potentially blowing up everything; at least I know how to fix most of it (yay replacing blown VRMs on motherboards).

deimos fucked around with this message at 01:24 on May 19, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

luigionlsd posted:

Not sure if this belongs in the part picking thread, but what's the word on reference vs. third-party coolers for the GeForce 780? Only considering EVGA, and the price difference is about $20 max between choices.

As classy as the reference design is, I have a feeling the ACX is going to be more efficient.

The ACX cooler is significantly quieter and runs significantly cooler than reference, although the MSI cooler is significantly quieter still for pretty much the same cooling. (The difference between MSI and EVGA is larger than between EVGA and reference, IIRC.)

EVGA would be the second-best option, however, and if you absolutely must have EVGA for some reason you shouldn't feel bad about it at all.

The Lord Bude fucked around with this message at 05:33 on May 19, 2014

SlayVus
Jul 10, 2009
Grimey Drawer
Anyone know what causes frame freezing in ShadowPlay videos? A couple months ago, my videos of gameplay would still play audio, but the video would get all garbled up, then continue playing a few seconds later.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The Lord Bude posted:

The ACX cooler is significantly quieter and runs significantly cooler than reference, although the MSI cooler is significantly quieter still for pretty much the same cooling. (The difference between MSI and EVGA is larger than between EVGA and reference, IIRC.)

EVGA would be the second-best option, however, and if you absolutely must have EVGA for some reason you shouldn't feel bad about it at all.

EVGA originally led the pack in both cooling perf and noise on GK110 chips (the ACX setup didn't do as well with the GK104 designs, however; other companies had them beat out of the gate there for lowest noise and highest performance). Then all the other companies redid their cooling setups slightly and from all I can tell, EVGA was like "you know what's already cool? ACX, we ain't fuckin' with it :rock:" and so now they sit in second or third place depending on your preferred metrics - but they're all so much better than stock that it's barely a thing, and they also happen to distinguish themselves by producing a genuinely 2-slot solution, completely so, with no overage AT ALL into what would be a third slot. That can be helpful if you're doing stuff other than graphics graphics graphics shiny shiny shiny and need the PCI-e space, but it'd be a bad idea to try to run multiple 780/780Ti ACX cards in SLI, as there just is not enough room for the fans to do their thing. It can run fine with a smaller normal PCI-e card of some sort adjacent, especially if (like most enthusiasts still using air, like me, heh) you've got a big, powerful fan just bathing the fuckin' thing in cool air, but... Blowers still win for multiple-adjacent-card SLI/Xfire, since robbing the general area of air to exhaust outside your case is their specialty, and so it shall be until either you go liquid or some next-gen cooling tech is invented.

Agreed fucked around with this message at 10:44 on May 19, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Agreed posted:

EVGA originally led the pack in both cooling perf and noise on GK110 chips (the ACX setup didn't do as well with the GK104 designs, however; other companies had them beat out of the gate there for lowest noise and highest performance). Then all the other companies redid their cooling setups slightly and from all I can tell, EVGA was like "you know what's already cool? ACX, we ain't fuckin' with it :rock:" and so now they sit in second or third place depending on your preferred metrics - but they're all so much better than stock that it's barely a thing, and they also happen to distinguish themselves by producing a genuinely 2-slot solution, completely so, with no overage AT ALL into what would be a third slot. That can be helpful if you're doing stuff other than graphics graphics graphics shiny shiny shiny and need the PCI-e space, but it'd be a bad idea to try to run multiple 780/780Ti ACX cards in SLI, as there just is not enough room for the fans to do their thing. It can run fine with a smaller normal PCI-e card of some sort adjacent, especially if (like most enthusiasts still using air, like me, heh) you've got a big, powerful fan just bathing the fuckin' thing in cool air, but... Blowers still win for multiple-adjacent-card SLI/Xfire, since robbing the general area of air to exhaust outside your case is their specialty, and so it shall be until either you go liquid or some next-gen cooling tech is invented.

Isn't the MSI cooler also exactly dual-slot? It fits fine in my Prodigy, and I would have thought it wouldn't if it were wider.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The Lord Bude posted:

Isn't the MSI cooler also exactly dual-slot? It fits fine in my Prodigy, and I would have thought it wouldn't if it were wider.

Don't think it used to be, but I'm not exactly shuffling cards, already have the big one, you know? I know MSI and Asus both redid their coolers profoundly - I think it's Asus' that uses a hybrid blower/standard fan setup, which is just too cool as a concept and I'd love to try that thing but I'm not going to pay for the privilege so nooope - it's possible that EVGA is basically just sitting on their card as being "totally good enough" as the original setter of standards and other companies feel compelled to tick the same boxes as time goes on. That's how it works, ideally, anyway, right? For consumers, we've got the best of both worlds - they have to all be at least as good as the finished reference model, so you can count on it Not Sucking, and yet there's plenty of room to play around in terms of competition once you've hit the bare minimum threshold of Not Sucking.

I wonder if AMD has any plans to do something like that. Plenty of companies have standardized to at least this degree now, so why not them? Get those Sapphire failure figures back in line, heh.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Agreed posted:

Don't think it used to be, but I'm not exactly shuffling cards, already have the big one, you know? I know MSI and Asus both redid their coolers profoundly - I think it's Asus' that uses a hybrid blower/standard fan setup, which is just too cool as a concept and I'd love to try that thing but I'm not going to pay for the privilege so nooope - it's possible that EVGA is basically just sitting on their card as being "totally good enough" as the original setter of standards and other companies feel compelled to tick the same boxes as time goes on. That's how it works, ideally, anyway, right? For consumers, we've got the best of both worlds - they have to all be at least as good as the finished reference model, so you can count on it Not Sucking, and yet there's plenty of room to play around in terms of competition once you've hit the bare minimum threshold of Not Sucking.

I wonder if AMD has any plans to do something like that. Plenty of companies have standardized to at least this degree now, so why not them? Get those Sapphire failure figures back in line, heh.

Yes, the Asus is the hybrid design, but from what I've read it has the worst cooling of the major brands and is so noisy that it only beats the reference 780 Ti cooler by a decibel or so.

BurritoJustice
Oct 9, 2012

The MSI is perfectly dual-slot, with none of the extra size of the ACX. The Asus is dual-slot in the normal sense, but it has a backplate that sticks out a bunch and can cause problems with the slot behind it (I had problems with it on a build with a mATX motherboard where the backplate got in the way of cooler fan clips, even on an Extreme4, which has some of the most spacious CPU cooler area). The Gigabyte has similar problems with length, as the cooler extends beyond the PCB. The Galaxy HOF cooler is a fantastic performer, but it is out of spec in every direction, with its long 2.5-slot cooler and backplate, and an even taller PCB. The Classy takes the strange step of going for a ridiculously tall PCB to fit in all the extras, but that is the direction that causes the fewest problems. The Lightning cards have 2.5-slot coolers, and can cause problems behind the card unless you remove the add-on PCB from the back (only on the 780).

I think that covers them all.

Edit: also, Bude is right about the Asus cooler being mediocre; I only built with one because my friend selected it due to his Asus hard-on (I recommended the MSI).

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I guess that's what you get with a long life cycle for a product generation: more and more variation as time goes by. It sucks that Asus' hybrid design doesn't work so well in practice, but I still like the idea of it; maybe a better implementation could do more with it. I dunno. Did Gigabyte ever update their Windforce coolers? They were another one that came out strong early, so I'm not sure if they've actually needed to in terms of single-card perf provided space is not an issue. I seem to remember that it was the first card to really stand up to the ACX cooler (which really was the shit when it came out, and a smart move on EVGA's part, since there is no way in hell it costs more to make that largely plastic cooler with nickel-plated heatpipes compared to the insane, very very made-of-metal GK110 reference cooler design).

Hopefully this playground of a generation has given everyone involved some cool ideas about how to maximize their cooling efficiency or efficacy (hey, some don't mind more noise if it means more perf) for the next generation of products. Maxwell should have some very nice cooling options when it finally hits.

I actually do technically use two cards, both EVGA, both open-air coolers, but the 650Ti SC generates very little heat and they're spaced several PCI-e slots apart with nothing in between them to get in the way, so there's more than sufficient airflow. I may upgrade from the 200mm fan I have on the side blasting them to its 230mm big brother that came out pretty much right after I bought it, d'oooh. My airflow setup remains weird, but it's also effective, keeping the cards at room temperature while idle and not letting either of them get remotely hot even when playing unoptimized, uncapped, and thus demanding games.

It's kinda funny to me that out of all the games that can stress the card, the one actually capable of getting it into the mid-50ºC range is the Defense Grid 2 beta - strictly due to a severe lack of optimization, though it is at least running the DX10 path on the card, which out of DX9-DX11 seems to be the hottest one on Kepler, going not just by this but by other testing in other games. DX10 is just a tad weird; I know DX11 is the focus of optimizations and all that, but it feels strange to see the middle step that few games really utilized being the one that can take a very cool-running setup and push it harder than DX9 with hacked-in very pretty stuff (The Witcher 2, especially) or DX11 games too numerous to mention. Crysis 3 doesn't have shit on the DG2 beta when it comes to working out my card, right now anyway. Optimizations will come and change all of that, I'm fairly certain of that much, haha.

Ignoarints
Nov 26, 2010
Maybe the hybrid design works better with lower workloads, because it was by far the best cooler on the 660 Tis I tried (completely beating the MSI Twin Frozrs). At full load I could barely get over 50 degrees at 30% fan speed... with a BIOS-modded 1.21v running at 1300+ MHz. On stock BIOS, with no core overclock, a pair of MSIs could achieve nearly the same temperatures but at twice the fan speed, and it was wayyyy louder.

But I've seen many references saying it's not as good for 770s and up.

MondayHotDog
Feb 18, 2006

Worked-up silent dork without sex ability seeks oblivion and demise.
The Great Twist
Have AMD worked out the non-Crossfire microstuttering problem with the R9 series? I get it pretty bad with just a single 7850. I'm thinking about a new GPU and those used 290s are tempting at the same price as a 770, but I'd rather go with nVidia if the stuttering issues still exist. I know it's supposedly pretty rare with just one card but it's really noticeable to me. Maybe I just have a bad card.

Ignoarints
Nov 26, 2010

MondayHotDog posted:

Have AMD worked out the non-Crossfire microstuttering problem with the R9 series? I get it pretty bad with just a single 7850. I'm thinking about a new GPU and those used 290s are tempting at the same price as a 770, but I'd rather go with nVidia if the stuttering issues still exist. I know it's supposedly pretty rare with just one card but it's really noticeable to me. Maybe I just have a bad card.

This is anecdotal and not even AMD-specific experience, but I find that pretty much all cards will experience this once you push them to their limits, especially with memory-intensive things. SLI and Crossfire will exacerbate it, but it is separate from the microstuttering issues specific to SLI/xfire, although they look similar.

So anyways, I wouldn't buy nVidia over AMD just for that. If you're willing to get a used 290, there is almost zero reason not to at those prices. (And from what I've heard, you could do a bit better than the 770 price point, but perhaps that has dried up.)

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

MondayHotDog posted:

Have AMD worked out the non-Crossfire microstuttering problem with the R9 series? I get it pretty bad with just a single 7850. I'm thinking about a new GPU and those used 290s are tempting at the same price as a 770, but I'd rather go with nVidia if the stuttering issues still exist. I know it's supposedly pretty rare with just one card but it's really noticeable to me. Maybe I just have a bad card.

My understanding is that they've focused pretty much exclusively on making sure that their cards going forward (so R9 290, R9 290X, and the dually card) shouldn't suffer microstuttering in a measurable or visible way. It's just a known problem that Tahiti and earlier cards still have the issue, though it only really pops up in a super obnoxious way when you're in one way or another stressing the product's (or products', for xfire) specifications, playing close to or past their limit for smooth gameplay.

If you trace back through the issue, what you find is that nVidia knew about the problem in DirectX but kept it to themselves, so you don't really get microstuttering to anywhere near the same degree with SLI and it's virtually nonexistent with a single card. Enlightened self-interest (also known as "hey AMD fuck you lol") moved them to release their framerate analysis tool, which proved that AMD's method - a method, by the way, that SHOULD have been fine, because it was essentially just trusting the DirectX specification - was insufficient and caused microstuttering in xfire like crazy, but also had issues in general rendering. They moved as fast as they possibly could to address it, but they had to look forward rather than backward, because they can't afford to spend their driver team's limited resources fixing older tech's problems... Hence the various categories of fixes, with only the R9 290/290x getting the full array of them and everything before that getting limited or no support.

Remember that ultimately it goes back to "Wow, DirectX was shitty at this and the specification didn't work like it was supposed to," not "Man, nVidia is way better than AMD at coding drivers!" - AMD just assumed that the spec would work, because why in the hell wouldn't it? (The answer is because DirectX kinda blows, which is something people working on rendering have known for a long time and something everybody else has found out since :v:).

There's an added layer of hilarity to the whole debacle when you look at that brief period where there was an official statement by Microsoft that DirectX was essentially "feature complete" and everyone went "what in the fuck are you talking about," with the ensuing clarifications that no really they meant, uh, Direct3D was, uh, no wait, NEVERMIND SORRY. The whole thing was a mess, and remains a mess. nVidia miraculously wrings up to 70%+ perf improvement out of some driver optimizations in response to Mantle, Microsoft is still working on more DirectX stuff, I don't know WHAT the fuck is happening with WDDM (it can go to hell; for hate's sake I spit my last breath at thee, WDDM), and everybody keeps onnnn truckin'.
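For anyone wondering what "measuring microstutter" actually means: tools like the frame-analysis one nVidia released boil down to per-frame timestamps, and the math on top of them is tiny. Here is a minimal sketch of that kind of analysis in C++ (the log name and format are made up for illustration, not any real tool's output):

code:
// Microstutter shows up as large frame-to-frame deltas even when the
// average FPS looks fine, so report deltas and percentiles, not just
// the mean. Assumes a plain-text log with one frame time in
// milliseconds per line; "frametimes.txt" is a hypothetical file.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <fstream>
#include <vector>

int main() {
    std::ifstream log("frametimes.txt");
    std::vector<double> ft;
    double ms;
    while (log >> ms) ft.push_back(ms);
    if (ft.size() < 2) return 1;

    double sum = 0, jitter = 0;
    for (size_t i = 0; i < ft.size(); ++i) {
        sum += ft[i];
        if (i) jitter += std::fabs(ft[i] - ft[i - 1]);
    }
    std::sort(ft.begin(), ft.end());
    printf("avg: %.2f ms (%.1f FPS)\n", sum / ft.size(),
           1000.0 * ft.size() / sum);
    printf("99th percentile: %.2f ms\n", ft[ft.size() * 99 / 100]);
    printf("mean frame-to-frame delta: %.2f ms\n",
           jitter / (ft.size() - 1));
}

A smooth card and a stuttering one can post the same average FPS; the 99th-percentile and delta numbers are where they split, which is exactly what made the tool so damaging to AMD's Crossfire numbers at the time.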

Grim Up North
Dec 12, 2011

Here's a fun link I haven't seen posted on here:

http://richg42.blogspot.de/2014/05/the-truth-on-opengl-driver-quality.html

It's Rich Geldreich's impression of various OpenGL drivers.

quote:

Vendor A
[...]
Even so, until Source1 was ported to Linux and Valve devs totally held the hands of this driver's devs they couldn't even update a buffer (via a Map or BufferSubData) the D3D9/11-style way without it constantly stalling the pipeline. We're talking "driver perf 101" stuff here, so it's not without its historical faults. Also, when you hit a bug in this driver it tends to just fall flat on its face and either crash the GPU or (on Windows) TDR your system. Still, it's a very reliable/solid driver.

quote:

Vendor B
A complete hodgepodge, inconsistent performance, very buggy, inconsistent regression testing, dysfunctional driver threading that is completely outside of the dev's official control.
[...]
Vendor B driver's key extensions just don't work. They are play or paper extensions, put in there to pad resumes and show progress to managers. Major GL developers never use these extensions because they don't work. But they sound good on paper and show progress. Vendor B's extensions are a perfect demonstration of why GL extensions suck in practice.

quote:

Vendor C
It's hard to ever genuinely get angry at Vendor C. They don't really want to do graphics, it's really just a distraction from their historically core business, but the trend is to integrate everything onto one die and they have plenty of die space to spare.
[...]
These folks actually have so much money and their org charts are so deep and wide they can afford two entirely different driver teams! (That's right - for this vendor, on one platform you get GL driver #1, and another you get GL driver #2, and they are completely different codebases and teams.)

:allears:
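For context on the "couldn't even update a buffer without stalling the pipeline" line: the standard workaround is buffer orphaning, where you hand the old storage back to the driver so it doesn't have to wait on frames the GPU is still chewing on. A rough sketch of the generic pattern (assumes a current GL context and a loader like GLEW; this is the well-known technique, not Valve's actual code):

code:
#include <GL/glew.h>

// Stream per-frame vertex data into a VBO without a pipeline stall.
// Passing NULL to glBufferData "orphans" the old storage: the driver
// allocates a fresh block and lets in-flight frames keep the old one,
// so the upload that follows doesn't have to synchronize.
void stream_vertices(GLuint vbo, GLsizeiptr size, const void* data)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, size, NULL, GL_STREAM_DRAW);  // orphan
    glBufferSubData(GL_ARRAY_BUFFER, 0, size, data);            // fill
}

It's "driver perf 101" in exactly the sense Geldreich means: nothing exotic, just a pattern the driver has to be written to recognize and handle well.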

Ignoarints
Nov 26, 2010
Is anyone expecting Mantle to take off? Older Crossfire simply sucks, and frankly R9 290+ Crossfire isn't the greatest thing on paper either, even though it's much improved visually. But when Mantle is involved, things seem to change, although there is less info on this. However, since nVidia responded pretty quickly with similar improvements using DirectX, is that going to be too damaging for developers to bother coding for Mantle?

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



BurritoJustice posted:

The Lightning cards have 2.5-slot coolers, and can cause problems behind the card unless you remove the add-on PCB from the back (only on the 780).
Yeah, the "power module" PCB MSI has on the Lightning apparently will push on the fan of another Lighting if in SLI due to the cooler being ~ 2.5 slots.

I'm crazy and have been thinking about SLI'ing two 780 Lightnings, and about how to get more airflow in there. I'd be removing the lower card's add-on PCB, and my case (Air 540) conveniently has the lower front intake fan positioned so that it blows air directly onto/between where the two cards would be, so that would hopefully provide the desired airflow. I have a Prolimatech USV-14 there currently, but just in case I need more airflow I have a SpotCool waiting as well. Ugh at SLI'ing these cards...

(My alternative approach is to just get a reference-cooled card and put it as the top card, but I'd have to go with the EVGA Superclocked version of the stock cooler in order to not bring down the Lightning's stock/boost clocks that much...)

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Ignoarints posted:

Maybe the hybrid design works better with lower workloads, because it was by far the best cooler on the 660 Tis I tried (completely beating the MSI Twin Frozrs). At full load I could barely get over 50 degrees at 30% fan speed... with a BIOS-modded 1.21v running at 1300+ MHz. On stock BIOS, with no core overclock, a pair of MSIs could achieve nearly the same temperatures but at twice the fan speed, and it was wayyyy louder.

But I've seen many references saying it's not as good for 770s and up.

Well, the 660 Ti is really old now. Maybe everyone else just put in a ton of effort and got better while Asus stagnated.

Ignoarints
Nov 26, 2010

The Lord Bude posted:

Well, the 660 Ti is really old now. Maybe everyone else just put in a ton of effort and got better while Asus stagnated.

True. I did just look up a 760 out of curiosity, since it's very similar in performance, and the only noise comparison between brands I found did show it getting beaten by MSI at full load. Looking at pictures, they did increase the fan size on the MSI, while Asus actually made the overall cooler smaller (it was a smaller PCB too). I dunno.

At any rate, they haven't kept up, for whatever reason. But the 660 Ti Asus was amazingly quiet, just downright silent. A shame they aren't anymore, even on a similar card.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Grim Up North posted:

Here's a fun link I haven't seen posted on here:

http://richg42.blogspot.de/2014/05/the-truth-on-opengl-driver-quality.html

It's Rich Geldreich's impression of various OpenGL drivers.


:allears:

What a lost opportunity to do Vendor A(MD) but noo...
A = nV
B = AMD
C = Intel?

At least I think so, given the gDEBugger reference for B.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party
A is NV, B is Intel, C1 is Intel Linux, and C2 is Intel Windows.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Professor Science posted:

A is NV, B is Intel, C1 is Intel Linux, and C2 is Intel Windows.

Wait, why would Intel be three different vendors?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Professor Science posted:

A is NV, B is Intel, C1 is Intel Linux, and C2 is Intel Windows.

What in the fuck did you ingest before making this post?

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party
oops, B is AMD. (I wish I'd ingested something; maybe I'm just getting sick.)

Arzachel
May 12, 2012

MondayHotDog posted:

Have AMD worked out the non-Crossfire microstuttering problem with the R9 series? I get it pretty bad with just a single 7850. I'm thinking about a new GPU and those used 290s are tempting at the same price as a 770, but I'd rather go with nVidia if the stuttering issues still exist. I know it's supposedly pretty rare with just one card but it's really noticeable to me. Maybe I just have a bad card.

Single-card microstutter shouldn't have been a thing since last May/June's betas, IIRC. If you're on semi-recent drivers, it's likely a different part of the system or bad memory on the card.

Arzachel fucked around with this message at 13:00 on May 20, 2014

veedubfreak
Apr 2, 2005

by Smythe
I have no stutter of any sort on my 290s both in single and xfire applications. My guess is that you are just running into issues where the card is running out of oomph.

Shaocaholica
Oct 29, 2002

Fig. 5E
Just reading up on some new tech stuff:

quote:

3D Memory: Stacks DRAM chips into dense modules with wide interfaces, and brings them inside the same package as the GPU. This lets GPUs get data from memory more quickly – boosting throughput and efficiency – allowing us to build more compact GPUs that put more power into smaller devices. The result: several times greater bandwidth, more than twice the memory capacity and quadrupled energy efficiency

Unified Memory: This will make building applications that take advantage of what both GPUs and CPUs can do quicker and easier by allowing the CPU to access the GPU’s memory, and the GPU to access the CPU’s memory, so developers don’t have to allocate resources between the two.

NVLink: Today’s computers are constrained by the speed at which data can move between the CPU and GPU. NVLink puts a fatter pipe between the CPU and GPU, allowing data to flow at more than 80GB per second, compared to the 16GB per second available now

Pascal Module: NVIDIA has designed a module to house Pascal GPUs with NVLink. At one-third the size of the standard boards used today, they’ll put the power of GPUs into more compact form factors than ever before.

Do any of these technologies really mean anything to consumer and small business use of GPUs? Maybe future consoles? Stuff like gaming, image processing, video processing and small studio scale GPU rendering (oh and doge coin mining).

Seems like the last two will require non-backwards-compatible CPU and mainboard hardware changes that are unlikely in the consumer space in the near term, even though I think current standards are a bit antiquated; but that's another discussion.

Shaocaholica fucked around with this message at 19:39 on May 20, 2014
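The 80 vs. 16 GB/s figures quoted above are easier to feel as transfer times. A back-of-the-envelope comparison (peak rates taken from the quote, ignoring protocol overhead; the 4 GB working set is just a made-up example):

code:
#include <cstdio>

// Time to move a working set across the CPU<->GPU link at the quoted
// peak rates. Real transfers see overhead and latency on top of this,
// so treat these as best-case numbers.
int main() {
    const double gb     = 4.0;   // hypothetical working set, GB
    const double pcie   = 16.0;  // GB/s, PCIe 3.0 x16 (quoted)
    const double nvlink = 80.0;  // GB/s, NVLink (quoted)
    printf("PCIe:   %.0f ms\n", gb / pcie   * 1000.0);  // 250 ms
    printf("NVLink: %.0f ms\n", gb / nvlink * 1000.0);  //  50 ms
}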

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Arzachel posted:

Single-card microstutter shouldn't have been a thing since last May/June's betas, IIRC. If you're on semi-recent drivers, it's likely a different part of the system or bad memory on the card.
Micro-stutter is only fixed for Direct3D 10/11 applications. There have been no stutter fixes for OpenGL or Direct3D 9 titles; those will come in Phase 3, for which no schedule has been announced.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

Shaocaholica posted:

Do any of these technologies really mean anything to consumer and small business use of GPUs? Maybe future consoles? Stuff like gaming, image processing, video processing and small studio scale GPU rendering (oh and doge coin mining).

3D Memory is probably the only thing that we'll see on the consumer side since it doesn't require any unique external hardware or software changes. This is also the one that would provide the greatest direct benefit to gaming since increasing the memory bandwidth will allow for larger textures and improvements at higher resolutions.

The other 3 items are aimed more at GPU compute. Unified memory will allow developers to stop worrying about making sure the data being used is in the correct CPU/GPU memory space and just let the calculations happen wherever. In order for this to really be useful, though, they'll need to drastically increase the speed at which the CPU and GPU can communicate, since PCIe isn't really cutting it, bringing us to... NVLink. This is just a custom connector used to connect the GPU to the system that allows for more bandwidth than PCIe. The Pascal Module is just a version of a graphics card that uses NVLink instead of PCIe and allows for a smaller form factor.
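To make the unified memory point concrete, here's roughly what it looks like from the developer side in current CUDA terms (a toy sketch of the general idea, not anything NVIDIA has shown for Pascal specifically):

code:
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float* x, int n, float k) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= k;
}

int main() {
    const int n = 1 << 20;
    float* x;
    // One pointer, visible to both CPU and GPU; the runtime migrates
    // the data. Note there's no explicit host/device copy anywhere.
    cudaMallocManaged(&x, n * sizeof(float));
    for (int i = 0; i < n; ++i) x[i] = 1.0f;      // CPU writes
    scale<<<(n + 255) / 256, 256>>>(x, n, 2.0f);  // GPU reads/writes
    cudaDeviceSynchronize();
    printf("x[0] = %f\n", x[0]);                  // CPU reads result
    cudaFree(x);
}

The catch is exactly the one Krailor points at: that migration currently rides over PCIe, so the convenience can cost real performance, which is why unified memory and NVLink are pitched as a package.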

Shaocaholica
Oct 29, 2002

Fig. 5E
I'm interested in NVLink and the 'Pascal' module for how their physical properties deviate from the traditional card-A-in-slot-B arrangement. Also the potential for cooling solutions that differ from the current two-slot form factor, and power routing that's more elegant than a crap ton of wires from the PSU. But I know that's not going to drive those standards into the consumer space. Just interesting to think about and see physically manifested, rather than internet ramblings (from me) about how much I want to move on from 'legacy' form factors.

Rastor
Jun 2, 2001

Shaocaholica posted:

I'm interested in NVLink and the 'Pascal' module for how their physical properties deviate from the traditional card-A-in-slot-B arrangement. Also the potential for cooling solutions that differ from the current two-slot form factor, and power routing that's more elegant than a crap ton of wires from the PSU. But I know that's not going to drive those standards into the consumer space. Just interesting to think about and see physically manifested, rather than internet ramblings (from me) about how much I want to move on from 'legacy' form factors.

Anandtech did a little write-up about what the heck Pascal / NVLink are. Interesting stuff but as you say, probably not going to hit the consumer space for a little while.

Shaocaholica
Oct 29, 2002

Fig. 5E
^^^ Thanks ^^^

Again, not consumer space yet but these are interesting points IMO because I've griped on them before:

Anandtech posted:

At the same time the connector will be designed to provide far more than the 75W PCIe is spec’d for today, allowing the GPU to be directly powered via the connector, as opposed to requiring external PCIe power cables that clutter up designs.

Anandtech posted:

Besides reducing trace lengths, this has the added benefit of allowing such GPUs to be cooled with CPU-style cooling methods

Shaocaholica
Oct 29, 2002

Fig. 5E
Come to think of it, most people's PCs only have one type of card that's perpendicular to the mainboard, and that's the GPU. If the GPU were parallel with the mainboard, then you wouldn't need towers per se. You could have -slim- desktops that could fit under your monitor stand, and heatsinks for the CPU/GPU could be wider and shorter and mounted in the same direction as gravity instead of perpendicular to it. Airflow could be easily routed in a uniform direction like a rackmount, but with more flexibility in z-height. That would be pretty neat.

Curious if NVLink will make it into workstation class designs.

Arzachel
May 12, 2012

Alereon posted:

Micro-stutter is only fixed for Direct3D 10/11 applications. There have been no stutter fixes for OpenGL or Direct3D 9 titles, these will come in Phase 3 for which no schedule has been announced.

That's for pre-Hawaii Crossfire issues. MondayHotDog was asking about single-GPU microstutter specifically.

Arzachel fucked around with this message at 22:35 on May 20, 2014

Strategy
Jul 1, 2002
Just scored 2 MSI TwinFrozr R9 290s still under warranty for $550, thanks crypto-currency mining.

Josh Lyman
May 24, 2009


Strategy posted:

Just scored 2 MSI TwinFrozr R9 290s still under warranty for $550, thanks crypto-currency mining.
Must resist temptation to upgrade from 760 :negative:.

Also, I don't think warranty works without the original purchase receipt.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Strategy posted:

Just scored 2 MSI TwinFrozr R9 290s still under warranty for $550, thanks crypto-currency mining.
There are a lot of great deals on 290's right now. Also, for anyone looking in that direction, Amazon has about a dozen used Gigabyte Windforce 290X's for $350-$360/ea. I know there's been talk of those particular boards having RAM issues, but they roll with a 3 year S/N-based warranty, so they're really a very good deal.

The ones to be semi-aware of are the Sapphire Tri-X's. On the upside, they've got great build quality, good OC potential, and are some of (if not THE) quietest on the market. The downside is Sapphire is kinda a dick about warranty support, so not only do you get just a 2-year warranty, but you also need the original invoice to get service--which makes it harder to recommend picking one up used.
