The Lord Bude
May 23, 2007

I'M DISAPPOINTED THAT CORTANA WILL BE A CIRCLE AND NOT THE ACTUAL SEXY WOMAN FROM THE GAME.


LooKMaN posted:

Does anyone have any experience with Gainward GeForce GTX 970 Phantom 4GB? I can get one for about £50 cheaper than other "cheap" 970s like Palit/KFA/Inno3d.

Their coolers aren't as good as the major OEMs - MSI/Asus etc - and the phantom cooler usually takes up 3 slots. On the other hand, the fans are designed to slide out for cleaning which is kinda neat.


1gnoirents
Jun 28, 2014

ASK ME ABOUT MY MICRO PENIS


I've never heard of Gainward, and I usually put a small premium on the cooling, however I'm talking $20. £50 seems like it'd sway me towards it as long as it worked. Like I'd rather have a Zotac for $330 versus a $400 Asus/MSI/whatnot.

Maybe this was mentioned and I missed it, but will we soon be able to combine SLI/Crossfire memory?

http://www.tweaktown.com/news/43347...ntle/index.html

1gnoirents fucked around with this message at 23:59 on Feb 24, 2015

Truga
May 4, 2014




Lipstick Apathy

http://www.tomshardware.com/news/mi...idia,28606.html

quote:

Exclusive: DirectX 12 Will Allow Multi-GPU Between GeForce And Radeon
I just saw this. Is this legit? I want to think it is because it's tom's hardware but...

Khagan
Aug 8, 2012

Words cannot describe just how terrible Vietnamese are.

I remember there being a controller/chip that could do this a couple of years ago. I think it was called Hydra and was definitely on an MSI board.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


Truga posted:

http://www.tomshardware.com/news/mi...idia,28606.html

I just saw this. Is this legit? I want to think it is because it's tom's hardware but...

Not if NVIDIA has anything to do with it; they're downright petty when it comes to that, such as not allowing an NVIDIA card to run as a PhysX card if an AMD card is installed.

Khagan posted:

I remember there being a controller/chip that could do this a couple of years ago. I think it was called Hydra and was definitely on an MSI board.

Apparently it was an enormous chunk of feces, though. I do have a board that has LucidLogix Virtu (Asus P8Z68-V Pro). Same guys, but it just allows you to use the on-die graphics alongside a PCIe card.

HalloKitty fucked around with this message at 12:12 on Feb 25, 2015

Party Plane Jones
Jul 1, 2007

by Reene


Fun Shoe

HalloKitty posted:

Not if NVIDIA has anything to do with it; they're downright petty when it comes to that, such as not allowing an NVIDIA card to run as a PhysX card if an AMD card is installed.

Wasn't there some bullshit thing with AA in specific games (like Batman) being forced off if Ati cards were detected?

Truga
May 4, 2014




Lipstick Apathy

HalloKitty posted:

Not if NVIDIA has anything to do with it; they're downright petty when it comes to that, such as not allowing an NVIDIA card to run as a PhysX card if an AMD card is installed.

I was just going to google about this. They're still doing this then? I guess I'll keep buying ATi; until now I've never had the issues with their cards/drivers that people keep whining about, and they seem to not be giant shitlords.

E: Further reading into that article, apparently they want to treat multiple gpus as one single gpu with a lot of power to render every frame. That sounds all kinds of cool and awesome, but will there be a non-poo poo engine that does it in the next 10 years?

Party Plane Jones posted:

Wasn't there some bullshit thing with AA in specific games (like Batman) being forced off if Ati cards were detected?

Google says yes. The way it's meant to be played, surely.

Truga fucked around with this message at 12:30 on Feb 25, 2015

Grim Up North
Dec 12, 2011



The new JPR numbers for discrete graphics card market share are out, and it is looking bad for AMD. Here is a graphic compiling the last thirteen years of (only) ATI/AMD vs NVIDIA with key releases for reference:



I hope AMD can still innovate in the GPU market and doesn't completely give up like they did with CPUs.

sauer kraut
Oct 2, 2004


That's a nice graph; Nvidia's CEO isn't kidding when he says the FX 5000 series nearly sank the company.
But then they made the 8800 GTX. Several times.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


sauer kraut posted:

That's a nice graph; Nvidia's CEO isn't kidding when he says the FX 5000 series nearly sank the company.
But then they made the 8800 GTX. Several times.

But then, why were they having reasonable success with the GTX 4xx series? The 480 was an incredibly power-hungry card, but back then, nobody seemed to really care, I guess. The power use difference between a 480 and a 5870 is quite a bit larger than the 970 vs 290X.

HalloKitty fucked around with this message at 12:59 on Feb 25, 2015

NyxBiker
Sep 24, 2014


Party Plane Jones posted:

Wasn't there some bullshit thing with AA in specific games (like Batman) being forced off if Ati cards were detected?

I can confirm this was happening.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!


Based on that graph, AMD is already dead. It's not like they're going to recover with the 300 series, so basically the entire market will get captured by Nvidia, especially since any fabled 400 series would have to pull a complete rabbit out of a hat bullshit to be competitive. They slipped into a coma early 2010 and it looks like life support is getting cut late 2015.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


FaustianQ posted:

Based on that graph, AMD is already dead. It's not like they're going to recover with the 300 series, so basically the entire market will get captured by Nvidia, especially since any fabled 400 series would have to pull a complete rabbit out of a hat bullshit to be competitive. They slipped into a coma early 2010 and it looks like life support is getting cut late 2015.

Aren't you just willing that to happen, though? Because you toxxed yourself based on it. There's nothing massively wrong with AMD GPUs. The 290 and 290X are somewhat power hungry, but the main reason for their decline is people constantly saying things like "oh, NVIDIA drivers are the best", and only recommending NVIDIA.

AMD CPUs, on the other hand, are a lost cause.

HalloKitty fucked around with this message at 13:51 on Feb 25, 2015

BurritoJustice
Oct 9, 2012



FaustianQ posted:

Based on that graph, AMD is already dead. It's not like they're going to recover with the 300 series, so basically the entire market will get captured by Nvidia, especially since any fabled 400 series would have to pull a complete rabbit out of a hat bullshit to be competitive. They slipped into a coma early 2010 and it looks like life support is getting cut late 2015.

AMD has been "already dead" since they were founded. They'll keep existing, somehow.

Edit: "Somewhat power hungry" is a bit of an understatement. Really, I think the difference is downplayed more than it has a right to be.

BurritoJustice fucked around with this message at 13:52 on Feb 25, 2015

The Lord Bude
May 23, 2007

I'M DISAPPOINTED THAT CORTANA WILL BE A CIRCLE AND NOT THE ACTUAL SEXY WOMAN FROM THE GAME.


HalloKitty posted:

people constantly saying things like "oh, NVIDIA drivers are the best", and only recommending NVIDIA.

Some of us can vouch for that though. I switched to AMD during the gtx480 vs 5*** era and I had a fuckton more driver issues during the 2.5 years I owned a 5970 vs the now 6 years of Nvidia ownership I've had under my belt, despite owning a 4 way SLi setup at one stage (9800GX2s).

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


BurritoJustice posted:

Edit: "Somewhat power hungry" is a bit of an understatement. Really, I think the difference is downplayed more than it has a right to be.


No way is it downplayed, it's mentioned constantly, and with the 970 and 980, rightfully so. But not everyone buys only the top end parts.

Must be only fair to mention the 480's performance/watt, then:

BurritoJustice
Oct 9, 2012



HalloKitty posted:

No way is it downplayed, it's mentioned constantly, and with the 970 and 980, rightfully so. But not everyone buys only the top end parts.

Must be only fair to mention the 480's performance/watt, then:



Maybe I've just been reading too much reddit lately. Any mention of someone preferring a 970 over a 290X for power usage reasons gets yelled at with "heat isn't a problem with custom coolers", "it is an extra dollar per year on your bill" and "the difference is only 40W under load anyway". Which is a misunderstanding of thermodynamics, an often false assumption of being in America, and flat out wrong, in that order.

If I ran a 290x where I lived instead of my 980 I wouldn't be able to use my computer with any sort of comfort.
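For what it's worth, the "dollar per year" claim falls apart with even a rough calculation. The 150 W load-power gap, 3 hours of gaming per day, and $0.13/kWh below are illustrative assumptions for the sketch, not figures from any review:

```python
# Back-of-envelope electricity cost of a GPU power gap.
# All inputs are illustrative assumptions, not measurements.

def annual_cost(extra_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Extra yearly electricity cost of the higher-power card."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

cost = annual_cost(150, 3, 0.13)
print(f"~${cost:.2f} per year")  # ~$21 per year, not $1
```

Swap in your own wattage gap, usage, and local electricity price; the point is only that the result lands in the tens of dollars, not one.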

1gnoirents
Jun 28, 2014

ASK ME ABOUT MY MICRO PENIS


how many people who justify buying AMD gpus because nvidia doesnt like to be compatible with AMD own apple products

veedubfreak
Apr 2, 2005

by Smythe


1gnoirents posted:

how many people who justify buying AMD gpus because nvidia doesnt like to be compatible with AMD own apple products

No one should own Apple products. Steve Jobs was a giant piece of poo poo and I'm glad he died of an easily curable cancer.

That being said, I have the opportunity to pick up 2 980 G1 cards and a 2nd block for 1000 bucks. Haven't decided if I really want to screw with SLI again or not.

(USER WAS PUT ON PROBATION FOR THIS POST)

Gwaihir
Dec 8, 2009



Hair Elf

2 G1s and blocks for $1000 seems like quite a steal compared to individual pricing on all that.

Stanley Pain
Jun 16, 2001

Bit. Trip. RIP.


veedubfreak posted:

No one should own Apple products. Steve Jobs was a giant piece of poo poo and I'm glad he died of an easily curable cancer.

That being said, I have the opportunity to pick up 2 980 G1 cards and a 2nd block for 1000 bucks. Haven't decided if I really want to screw with SLI again or not.

Do it so I don't have to :3

Zero VGS
Aug 16, 2002
"It has gunfights and shit!"


Lipstick Apathy

You guys need to stand around in Microcenter and watch how little informed decision-making Joe Public can muster. Then you'll know why AMD cpus and gpus are never going away.

The main takeaway is a lot of people buy a PC based on price and absolutely nothing else, which even encourages OEMs to try to stick AMD stuff into their poo poo. I'm worried it's becoming a de facto monopoly for PCs though; AMD will probably always have 10-20% market share until the end of eternity, but they have to do better than that.

Stanley Pain
Jun 16, 2001

Bit. Trip. RIP.


Zero VGS posted:

You guys need to stand around in Microcenter and watch how little informed decision-making Joe Public can muster. Then you'll know why AMD cpus and gpus are never going away.

The main takeaway is a lot of people buy a PC based on price and absolutely nothing else, which even encourages OEMs to try to stick AMD stuff into their poo poo. I'm worried it's becoming a de facto monopoly for PCs though; AMD will probably always have 10-20% market share until the end of eternity, but they have to do better than that.


This is exactly IT. People just want that super cheap poo poo and gently caress you if you try to talk me out of getting that cheap poo poo.

Truga
May 4, 2014




Lipstick Apathy

I honestly don't have issues with ATi/AMD GPUs though? The first and last time I did was when I got my first ever job and treated myself to a brand spanking new 4870x2 fresh out of the factory, and quickly found out Gothic 3 didn't work well with it: some trees glitching, and AA not working (all of which got patched in the first Catalyst update ~2 weeks later, only for me to find out it's actually a bad game). Before and since then, though, basically no issues.

I'm not going to say anything about their CPUs because they're poo poo tier right now, and have been for a while, but GPUs seem decent to me? Am I just super lucky and playing the exactly right games or something?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


Truga posted:

I'm not going to say anything about their CPUs because they're poo poo tier right now, and have been for a while, but GPUs seem decent to me? Am I just super lucky and playing the exactly right games or something?

No, their GPUs are basically fine, and a lot of people use them without issue.

1gnoirents
Jun 28, 2014

ASK ME ABOUT MY MICRO PENIS


Truga posted:

I honestly don't have issues with ATi/AMD GPUs though? The first and last time I did was when I got my first ever job and treated myself to a brand spanking new 4870x2 fresh out of the factory, and quickly found out Gothic 3 didn't work well with it: some trees glitching, and AA not working (all of which got patched in the first Catalyst update ~2 weeks later, only for me to find out it's actually a bad game). Before and since then, though, basically no issues.

I'm not going to say anything about their CPUs because they're poo poo tier right now, and have been for a while, but GPUs seem decent to me? Am I just super lucky and playing the exactly right games or something?

They are good. It's nothing like the CPU situation. It's just that there is a lot of concern with this next generation; it's possible AMD as a whole may drag them down. I wouldn't look too far into the rest of the sniping, it's just the nvidia scandal back and forth (poo poo talking a little too much, probably white knighting a little too much in retaliation, lol).

My concern starting around last year was that while AMD GPUs were pretty much great and competitive, they did seem to have a power "problem"... very much like their CPUs. I'm not saying it's related or even a dealbreaker for most, but then things started to turn south further when AMD didn't have a response to Nvidia's release. Both companies had been riding on old silicon for years as well, so, as with Nvidia, I fully expected AMD to at least match that even if they don't touch the power side of things. But now somewhat troubling (to me) reports are that only the 390X will be new. So we might be buying 7970s and poo poo all over again... something I owned like 4 years ago.

The 285 didn't bode well, despite being more of a features thing.

The 390X being factory water cooled is being touted as a feature when it only really indicates issues to me. And cost.

So combine that with the market share Nvidia has pulled despite being, well, Nvidia: is this the writing on the wall? I don't know. It's not good though. I rather liked it when you could go either way and be just as well off for the money spent, but I just don't get that feeling this time around. Time will tell I guess.

Truga
May 4, 2014




Lipstick Apathy

From what I got it'll be like r9 2xx release. R9 390 and 390x are a new thing, 380/x are previous 290/x, etc.

But yeah, I hope they can get down to less than 28nm soon, this is getting silly. And the market share thing is definitely not good.

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

Their GPU process tech is dictated by TSMC. They, and nvidia, can't do anything about it short of finding someone else to fab their stuff for them.

Who else could they use though? Maybe Samsung, but their process seems to be all memory and ultra low power oriented? Maaaybe Intel, if they're willing to spend the cash and people were willing to pay $rape$? IBM is getting out of the fab business. GlobalFoundries still seems to be having yield issues, and who knows what their 14nm process will be like and when it's really coming.

I think AMD and nvidia are both stuck with TSMC for a very long time.

Rastor
Jun 2, 2001



OK but TSMC has other processes besides 28nm, why is every goddamn AMD chip still on 28nm process?

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>


^^^ Ask Nvidia that too. I think they've even complained about TSMC's 20nm yields, which is part of the reason they're still on 28nm as well.

PC LOAD LETTER posted:

I think AMD and nvidia are both stuck with TSMC for a very long time.

With regards to AMD:

http://www.guru3d.com/news-story/am...-foundries.html

veedubfreak
Apr 2, 2005

by Smythe


Rastor posted:

OK but TSMC has other processes besides 28nm, why is every goddamn AMD chip still on 28nm process?

TSMC only has good production on lower power chips, as in stuff that goes in laptops and phones. Anything that actually requires a decent amount of power is stuck at 28nm.

Mr SoupTeeth
Jan 16, 2015


BurritoJustice posted:

Maybe I've just been reading too much reddit lately. Any mention of someone preferring a 970 over a 290X for power usage reasons gets yelled at with "heat isn't a problem with custom coolers", "it is an extra dollar per year on your bill" and "the difference is only 40W under load anyway". Which is a misunderstanding of thermodynamics, an often false assumption of being in America, and flat out wrong, in that order.

If I ran a 290x where I lived instead of my 980 I wouldn't be able to use my computer with any sort of comfort.

Ditching a reference 760 for a Strix 970 dropped my GPU temps by over 20C and 5-10C everywhere else in the case. That could be the difference between full CPU speed and thermal throttling if using something like an OEM cooler or a case with less space/airflow. It pays off everywhere to keep the most power hungry component in the system as cool and low wattage as possible.

Mr SoupTeeth fucked around with this message at 17:57 on Feb 25, 2015
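The room-comfort point can be put in rough numbers, since essentially every watt a card draws ends up as heat in the room, so a load-power gap between two cards behaves like a small always-on heater while gaming. The 150 W gap below is an illustrative assumption, not a measured figure for any particular pair of cards:

```python
# Every watt a GPU draws is ultimately dumped into the room as heat,
# so a power gap between two cards acts like a small space heater.
# The 150 W input below is an illustrative assumption.

BTU_PER_HOUR_PER_WATT = 3.412  # standard W -> BTU/hr conversion factor

def heater_equivalent(extra_watts: float) -> float:
    """Extra room heating in BTU/hr from the higher-power card under load."""
    return extra_watts * BTU_PER_HOUR_PER_WATT

print(f"{heater_equivalent(150):.0f} BTU/hr")  # ~512 BTU/hr of extra room heat
```

A few hundred BTU/hr is small next to a real space heater, but in a small, warm, or poorly ventilated room it is very noticeable over a long session.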

veedubfreak
Apr 2, 2005

by Smythe


Gwaihir posted:

2 G1s and blocks for $1000 seems like quite a steal compared to individual pricing on all that.

It's 2 G1 cards, a single waterblock and the CSQ sli connector. I already have 1 block, so all I would have to do is return the card I have now to Microcenter and use the block I have. Either way 2 cards and a block is still a good deal I figure.

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

Rastor posted:

OK but TSMC has other processes besides 28nm, why is every goddamn AMD chip still on 28nm process?

Because GlobalFoundries has process problems + is ~2yr behind Intel in process development. Also TSMC's other processes either aren't ready or have all the early supply bought out by Apple.

I'd also expect more delays from TSMC with further process improvements. They've been continuously late by quite a bit for the last few years. Since the 90nm days at least.

Hace posted:

With regards to AMD:

I thought there was still some confusion over whether they were talking about their high performance GPUs or APUs? There was some similar talk and confusion years ago.

Rastor
Jun 2, 2001



PC LOAD LETTER posted:

Maybe Samsung, but their process seems to be all memory and ultra low power oriented?

veedubfreak posted:

TSMC only has good production on lower power chips, as in stuff that goes in laptops and phones. Anything that actually requires a decent amount of power is stuck at 28nm.

PC LOAD LETTER posted:

Also TSMC's other processes either aren't ready or have all the early supply bought out by Apple.

So it sounds like the reason we can't have nice(r) GPUs is: "because iPhones".

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

Not too surprising if you think about it. Fabs need to sell large amounts of product to make money. High performance GPUs like those used in PCs are relatively niche, but millions of smartphones are sold every year. They have to focus their business on where the money is.

eggyolk
Nov 8, 2007



Rastor posted:

So it sounds like the reason we can't have nice(r) GPUs is: "because iPhones".

Isn't this the same reason we haven't had decent (non-Korean) 1440p monitors?

calusari
Apr 18, 2013

It's mechanical. Seems to come at regular intervals.

If anything we are getting them sooner, given the massive R&D costs of moving to a smaller node.

1gnoirents
Jun 28, 2014

ASK ME ABOUT MY MICRO PENIS


eggyolk posted:

Isn't this the same reason we haven't had decent (non-Korean) 1440p monitors?

This trend seems to finally be breaking this year, although I wonder how long 1440p will stay popular when 4K has a real chance of being viable with DX12, from my understanding.


SwissArmyDruid
Feb 14, 2014



Rastor posted:

OK but TSMC has other processes besides 28nm, why is every goddamn AMD chip still on 28nm process?

Because TSMC can't get their poo poo together. AMD and Nvidia were waffling between waiting on the 20nm process and just jumping a node.

Jumping a node is a non-trivial feat. You cannot just go into photoshop and use the resize tool to make your chip smaller.
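The payoff that makes a node jump tempting anyway is easy to sketch: if every linear dimension scaled with the node name, area would scale with its square. This is idealized geometric scaling only; real processes (and especially post-28nm marketing node names) don't shrink this cleanly, so treat it as a best case:

```python
# Idealized die-shrink arithmetic: linear dimensions scale with the node,
# so area scales with the square of the ratio. Real processes fall short
# of this, since node names are partly marketing.

def ideal_area_scale(old_nm: float, new_nm: float) -> float:
    """Best-case factor by which the same design's die area shrinks."""
    return (new_nm / old_nm) ** 2

shrink = ideal_area_scale(28, 20)
print(f"Same chip at ~{shrink:.0%} of its 28nm area")  # ~51%
```

Roughly half the area per chip means nearly twice the dies per wafer (or twice the transistors per die) in the ideal case, which is why both vendors kept eyeing the jump despite the cost.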
