Nephilm
Jun 11, 2009

by Lowtax

Ignoarints posted:

I am all for openness and cross-platform compatibility. I really, really, really am. But no matter what angle AMD is pushing now (and they are pushing one), they'd do the exact same things nvidia is doing if those things meant more money for them. Right now it's working in their favor to do the opposite of what nvidia is doing, if nothing else just for the PR (and I know it's more than just this). This is just the result of true competition and it will never change.

AMD has traditionally supported open standards much more than their competitors (Intel/nvidia). You can go ahead and try to argue it's because they've always been behind, but the fact doesn't change.

Actually, I'm not even sure who you're trying to argue against here.

Ignoarints posted:

I kind of wish gsync wasn't doomed, because I have an expensive nvidia card. If I had AMD, I'd hope that freesync takes off, but not because it's open to everyone. That's just an illusion I'm well aware could have been very different under different circumstances. If nvidia backed themselves into a corner with gsync and freesync is putting the pressure on, the fact that it's open (I mean, come on with the name freesync) is simply a characteristic of a concept whose sole goal is to compete with what nvidia is trying to pull off.

This doesn't matter unless you were rushing to get an expensive g-sync display. Adaptive sync being added to the DP standard actually allows for a rather quick rollout, but g-sync being what it is, chances are you'd have already bought a new videocard before there were monitors worth a drat that included it.


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I posit that nVidia will be pressured heavily to adhere to the standard or else be denied. I don't care if V-Sync meets an early grave in the process; it's bullshit that this hasn't been possible before, and however it makes it to market is fine by me, same as getting rid of absolutely ridiculous draw-call bottlenecks (WDDM, loving poo poo up for everyone trying to push video and audio in non-"Hey I'm watching a DVD with Windows Media Player! :haw:" ways since 2004). As consumers, we're benefiting from them actually slugging it out, and from nVidia having made a stupid move while being severely pressured in the SoC and HPC markets at the same time. Their big loving bags of money are worth a little less in a straight-up slug-fest of pure, unadulterated, desperate "you people focus on the GPUs, make them good so that we can keep winning there for sure."

They've made some clever moves but, like AMD, they are ALSO going up against Intel right now, and Intel is a motherfucker with laser eyes and money that makes Scrooge McDuck prematurely ejaculate. This little graphics card division is kind of a distraction for both of them, albeit an important one for prestige and the attention it generates. But nVidia's shareholders seriously give like half of a gently caress about most of this stuff; it's just us little folks who like big cards and fast computers and shiny graphics who benefit from what is an example of really slick competition between two companies who are dancing very well - as fast as they can, but still, very well.

Ignoarints
Nov 26, 2010

Nephilm posted:

AMD has traditionally supported open standards much more than their competitors (Intel/nvidia). You can go ahead and try to argue it's because they've always been behind, but the fact doesn't change.

Actually, I'm not even sure who you're trying to argue against here.


This doesn't matter unless you were rushing to get an expensive g-sync display. Adaptive sync being added to the DP standard actually allows for a rather quick rollout, but g-sync being what it is, chances are you'd have already bought a new videocard before there were monitors worth a drat that included it.

Nobody, sorry. I've just gotten unusually fed up with the unprecedented rage (literally coming from a single Watch Dogs review) that I'm hearing from so many people, in real life, who almost don't even play games at all. None of this is unusual or weird in any way to me - gameworks, any of it. I cannot think of another competitive industry that doesn't try to do the exact same sort of things. Not saying it always works out, but they try. I think I'm just feeling that sentiment from some people influencing the things they say, is all.

edit: For instance, I had to sit and nod through a co-worker telling me how gsync was proof that nvidia is evil, but because AMD was doing freesync, they were the consumer's best hope. I'm positive he doesn't even know what gsync does except that he can't have it. He went on to say how Mantle is a prime example because it's free for nvidia to use too (I seriously don't even know how he knows this poo poo) and I couldn't help but make the mistake of saying something like "But Mantle only works with AMD drivers. And is the API even released yet?" (it isn't). Oh boy.

That's the worst example, but I've heard grumbling from friends who don't even own a PC.

E.g. Watch Dogs runs like poo poo on their crappy PCs, Xboxes, and PS4s; one article blames gameworks, and suddenly nvidia is why Watch Dogs sucks and the world is ending.

Ignoarints fucked around with this message at 03:30 on Jun 5, 2014

beejay
Apr 7, 2002

That's why it's best to not become attached to things like corporations and video games. :) If you have dumb fanboy friends spouting nonsense then just nod and smile and let it go. Do your own research and know that you are doing the best you can for yourself. Play video games that are fun and don't play ones that aren't fun. That's my life advice re: computer parts and video games.

Taco Duck
Feb 18, 2011


TSMC is beginning volume production of 20nm chips for Nvidia and AMD.


Maybe there will be a high-end Maxwell card by the end of the year?

http://wccftech.com/tsmc-begins-20nm-volume-production-gpus-node/

SCheeseman
Apr 23, 2003

My 280X from that bargain basement bitcoin hardware seller on ebay arrived. In perfect condition, no dust, looks virtually unused.

And while I was installing it, I broke the SATA connector on my SSD off (that little plastic thingy) :suicide: I tried to solder the cable directly to the intact pins but just ended up burning my fingers. I'm really terrible at using an iron.

It's an old Corsair Force 3 120GB anyway. I'm gonna pick up an EVO 250GB today to replace it. Still really annoying :(

Shaocaholica
Oct 29, 2002

Fig. 5E

SwissCM posted:

My 280X from that bargain basement bitcoin hardware seller on ebay arrived. In perfect condition, no dust, looks virtually unused.

And while I was installing it, I broke the SATA connector on my SSD off (that little plastic thingy) :suicide: I tried to solder the cable directly to the intact pins but just ended up burning my fingers. I'm really terrible at using an iron.

It's an old Corsair Force 3 120GB anyway. I'm gonna pick up an EVO 250GB today to replace it. Still really annoying :(

Pics plz

SCheeseman
Apr 23, 2003


Of the card or my poor, broken SSD?

The video card looks like a brand new Sapphire DualX R9 280X 3GB. There isn't a blemish on it.

NJD2005
Sep 3, 2006
...
If anyone wants some cheap R9 290s, someone put 10 up on ebay for $230 + $5 shipping each.
http://www.ebay.com/itm/VisionTek-AMD-Radeon-R9-290-4GB-GDDR5-Video-Card-/281353828514?pt=PCC_Video_TV_Cards&hash=item4181feb8a2

Couldn't find much info about VisionTek, and their warranty only covers the original owner, so it's a gamble - plus the seller doesn't offer returns.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Black Dynamite posted:

TSMC is beginning volume production of 20nm chips for Nvidia and AMD.


Maybe there will be a high-end Maxwell card by the end of the year?

http://wccftech.com/tsmc-begins-20nm-volume-production-gpus-node/

So much for buying Devil's Canyon. Saving my money for a 20nm 880.

Ignoarints
Nov 26, 2010
my wallet is just going to commit suicide

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe

Ignoarints posted:

:lol:

edit: besides the name... it looks like it has more potential than the G10 if it has direct contact with the VRMs and memory. Too bad it's way uglier. Not something that usually matters to me, but it's a striking difference.

Also, it'd finally let me use all the square Corsair coolers.

Read the rest of the article. It doesn't have contact with either the VRMs or the memory. It just sticks a fan on those and attaches it to the card's existing fan control circuitry.

Arzachel
May 12, 2012

Ignoarints posted:

Nobody, sorry. I've just gotten unusually fed up with the unprecedented rage (literally coming from a single Watch Dogs review) that I'm hearing from so many people, in real life, who almost don't even play games at all. None of this is unusual or weird in any way to me - gameworks, any of it. I cannot think of another competitive industry that doesn't try to do the exact same sort of things. Not saying it always works out, but they try. I think I'm just feeling that sentiment from some people influencing the things they say, is all.

edit: For instance, I had to sit and nod through a co-worker telling me how gsync was proof that nvidia is evil, but because AMD was doing freesync, they were the consumer's best hope. I'm positive he doesn't even know what gsync does except that he can't have it. He went on to say how Mantle is a prime example because it's free for nvidia to use too (I seriously don't even know how he knows this poo poo) and I couldn't help but make the mistake of saying something like "But Mantle only works with AMD drivers. And is the API even released yet?" (it isn't). Oh boy.

That's the worst example, but I've heard grumbling from friends who don't even own a PC.

E.g. Watch Dogs runs like poo poo on their crappy PCs, Xboxes, and PS4s; one article blames gameworks, and suddenly nvidia is why Watch Dogs sucks and the world is ending.

Gameworks itself isn't bad, but Nvidia has quite the reputation for cockblocking the competition in every way they can. Remember how TWIMTBP game devs couldn't hand code over to AMD until the very last minute, forcing a half-baked driver and AMD looking bad in launch-day reviews? Remember how Arkham Asylum had Nvidia's MSAA implementation locked behind a hwid check, with Nvidia's lawyers threatening a lawsuit if it got enabled for AMD? Remember the world's most realistic concrete barriers and underwater oceans in Crysis 2? Everything PhysX, from locking you out if there's an AMD card present to (until a while ago) still using x87 for the CPU code path?

You could argue that every company does this and that AMD would do the same in Nvidia's place, but that doesn't change the fact that they have a history of screwing over everyone not buying their latest and greatest. AMD just managed to push some surprisingly good PR for once.

Arzachel fucked around with this message at 07:27 on Jun 5, 2014

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
Got my eBay MSI 280X ordered. Could almost hear the sigh of resignation when my offer was accepted.

Considered a 290 instead, but all the sub-$350 models either didn't have a transferable warranty or had the awful reference cooler (got flashbacks to the reference 4870 & 6950 - although at least with the 6950 you needed reference cards for unlocking). Kinda wish I'd seen those VisionTek 290s before making the offer, since it'd be worth replacing the cooler at that price, but too late now.

Looks like this 280X should be very quiet & cool though, and it should spare me from having to strip this Accelero cooler again, which is wonderful. Thanks, bitcoiners.

Christobevii3
Jul 3, 2006

Black Dynamite posted:

TSMC is beginning volume production of 20nm chips for Nvidia and AMD.


Maybe there will be a high-end Maxwell card by the end of the year?

http://wccftech.com/tsmc-begins-20nm-volume-production-gpus-node/

If Fermi's release is any gauge, I doubt nvidia would risk that again. If they did hold off for 20nm, I'd say Q2 2015 at the earliest.

Ignoarints
Nov 26, 2010

Arzachel posted:

Gameworks itself isn't bad, but Nvidia has quite the reputation for cockblocking the competition in every way they can. Remember how TWIMTBP game devs couldn't hand code over to AMD until the very last minute, forcing a half-baked driver and AMD looking bad in launch-day reviews? Remember how Arkham Asylum had Nvidia's MSAA implementation locked behind a hwid check, with Nvidia's lawyers threatening a lawsuit if it got enabled for AMD? Remember the world's most realistic concrete barriers and underwater oceans in Crysis 2? Everything PhysX, from locking you out if there's an AMD card present to (until a while ago) still using x87 for the CPU code path?

You could argue that every company does this and that AMD would do the same in Nvidia's place, but that doesn't change the fact that they have a history of screwing over everyone not buying their latest and greatest. AMD just managed to push some surprisingly good PR for once.

I know man, I'm sorry I went off. I'm not actually defending them, even though it sounds like I am. It's just so bland and normal that I'm desensitized; I'm surprised it isn't worse. It's fine to hear that from someone with actual examples and knowledge of the subject - it just bothers me to no end when people make the same arguments without actually knowing anything about it at all.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Arzachel posted:

You could argue that every company does this and that AMD would do the same in Nvidia's place, but that doesn't change the fact that they have a history of screwing over everyone not buying their latest and greatest. AMD just managed to push some surprisingly good PR for once.

They also do the same in the mobile space nowadays.

Ignoarints
Nov 26, 2010

craig588 posted:

bios stuff

Out of curiosity, are you able to edit the BIOS somehow? All I want is my voltage unlocked, and if I can do it myself I'd happily do so.

Shaocaholica
Oct 29, 2002

Fig. 5E
Is the 290x ref cooler really that bad? Link?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Shaocaholica posted:

Is the 290x ref cooler really that bad? Link?

Ever been in an airplane?

Shaocaholica
Oct 29, 2002

Fig. 5E

Don Lapre posted:

Ever been in an airplane?

So it's just a noise issue? Only when the GPU is loaded?

Ignoarints
Nov 26, 2010

Shaocaholica posted:

So it's just a noise issue? Only when the GPU is loaded?

Idle is almost as loud as some other cards at full throttle, I think.

I'm not a quiet case guy, but that seems kind of crazy loud to me.

The worst part is that it's not even good.

But yes, it's under full load. I know Furmark is unrealistic, but since every card is being subjected to it, I think it's still telling.

(If this is a mining-card question: look how hot reference cards get :( I'm not sure I'd be down for that. 80 degrees or so is an anecdotal 24/7 limit, but reports of problems seem to grow exponentially past that made-up temperature.)

Ignoarints fucked around with this message at 15:34 on Jun 5, 2014

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Say, if I were to get an R9 290, I'd have a "freesync"-capable card already for whenever capable displays come out, right?

Combat Pretzel fucked around with this message at 15:34 on Jun 5, 2014

beejay
Apr 7, 2002

Correct, right now it's 290/290x and 260.

Josh Lyman
May 24, 2009


Shaocaholica posted:

Is the 290x ref cooler really that bad? Link?
The 290/290x blower can get fairly loud, but it's not obnoxious. Only the early releases have it though.

Shaocaholica
Oct 29, 2002

Fig. 5E

beejay posted:

Correct, right now it's 290/290x and 260.

Why are the 280(X) and 270(X) excluded???

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Josh Lyman posted:

The 290/290x blower can get fairly loud, but it's not obnoxious. Only the early releases have it though.

That's subjective. IMO 60 decibels is INCREDIBLY obnoxious. I mean, sure, I dumped a ton of money into a GPU mod so my system has a noise ceiling of ~25 dBA, but really: 60 dBA is crazy for background white noise.
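For scale, basic decibel arithmetic (generic math, nothing measured off these particular cards) puts those two numbers absurdly far apart:

code:

# Sound power scales as 10^(dB/10), so the ratio between two dBA
# figures is 10^(difference/10). Values are just the ones above.
quiet_dba = 25.0    # the modded, water-cooled build
blower_dba = 60.0   # the loaded reference blower
ratio = 10 ** ((blower_dba - quiet_dba) / 10)
print(f"~{ratio:.0f}x the sound power")  # ~3162x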

Shaocaholica posted:

Why are the 280(X) and 270(X) excluded???

I have no idea, but the 290(X) and 260 are GCN 1.1 parts whereas the 280(X) and 270(X) are re-used HD 7000 (GCN 1.0) parts. Maybe that's it?

Factory Factory fucked around with this message at 16:14 on Jun 5, 2014

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Shaocaholica posted:

So it's just a noise issue? Only when the GPU is loaded?

The cooler is so lovely that the card will sometimes throttle even at stock clocks - not by a large amount, but it's still pretty annoying that it happens at all. I'd still go for a card with a reference cooler if it was a good enough deal, but I'd try to get one with a third-party cooler if at all possible. People make a big deal about the 95C temp that it maintains, but really I think the noise and throttling are much bigger issues.

Shaocaholica
Oct 29, 2002

Fig. 5E
So I guess with adaptive sync/g-sync you can still have microstutter in SLI/CF configurations, right? Something like two frames very fast, then a delay, then two frames again. The display at least displays the frames as they come, without having to shove them into a predefined frequency, but that doesn't eliminate the fact that the frames are being rendered with a non-uniform cadence due to SLI/CF. Maybe it won't be human-perceptible with async.
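(A minimal sketch of that cadence, with made-up numbers - two GPUs doing alternate-frame rendering, the second starting a few ms after the first - shows why a display that presents frames as they arrive still can't even out the gaps:)

code:

# Toy AFR model; all numbers invented, not measurements.
render_ms = 33.0   # per-GPU time to render one frame
offset_ms = 5.0    # how far apart the two GPUs kick off
done = []
for i in range(8):
    gpu = i % 2                        # even frames on GPU 0, odd on GPU 1
    start = (i // 2) * render_ms + gpu * offset_ms
    done.append(start + render_ms)
gaps = [b - a for a, b in zip(done, done[1:])]
print(gaps)  # alternates ~5 ms and ~28 ms - that alternation is the stutter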

Ignoarints
Nov 26, 2010

kode54 posted:

Read the rest of the article. It doesn't have contact with either the VRMs or the memory. It just sticks a fan on those and attaches it to the card's existing fan control circuitry.

I missed this. I'd read that it does, although it's not very clear about it. How is it supposed to reduce VRM temps by 25 degrees over stock if it doesn't, though? That would be a big deal.

quote:

The most notable feature of the Corsair HG10 is that it uses the stock-cooler reference fan and a heatsink to cool the VRAM and VRM

quote:

After installing the HG10 GPU bracket, a PC enthusiast can mount any Corsair Hydro Series liquid CPU cooler (available separately) to chill their graphics card's GPU (up to 50°C) and VRM/VRAM (up to 25°C) lower than stock temperatures.

Sucks that it won't be available for me for a while, but that'll give me time to pick a drat case I guess.

beejay
Apr 7, 2002

Factory Factory posted:

I have no idea, but the 290(X) and 260 are GCN 1.1 parts whereas the 280(X) and 270(X) are re-used HD 7000 (GCN 1.0) parts. Maybe that's it?

Yeah this seems to be it. I'm not sure of the technological reason but I assume there must be one.

Josh Lyman
May 24, 2009


Factory Factory posted:

That's subjective. IMO 60 decibels is INCREDIBLY obnoxious. I mean, sure, I dumped a ton of money into a GPU mod so my system has a noise ceilingof ~25 dBA, but really: 60 dBA is crazy for background white noise.
Sure, noise is subjective, but having heard an R9 in person, it wasn't as bad as I thought it might be.

Ffycchi
Jun 4, 2014

Sigh...challenge accepted...shitty photoshop incoming.
Think it's worth buying a second 780ti or waiting for Maxwell?

veedubfreak
Apr 2, 2005

by Smythe
^^^Is your current 780 not enough for what you play?

Josh Lyman posted:

The 290/290x blower can get fairly loud, but it's not obnoxious. Only the early releases have it though.

If you replace the TIM on the reference cards, they are much more efficient and don't have to ramp up nearly as much.

Ignoarints
Nov 26, 2010

Ffycchi posted:

Think it's worth buying a second 780ti or waiting for Maxwell?

I'm thinking about holding off on SLI too, now that there's some kind of Maxwell news. At least this way I can dump some money into Devil's Canyon... although I know in reality a second 780ti will give me actual results, where Devil's Canyon might be more for "haha, I have a 5 GHz processor".

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

veedubfreak posted:

If you replace the tim on the reference cards they are much more efficient and don't have to ramp up nearly as much.
They've said that every generation since the 4000-series blowers were released. Replacing the TIM may keep it from ramping up as much, but they're still (subjectively) very noticeable at 40-50% and incredibly noisy at 100%. The sudden jumps in stock fan curves usually don't help much either. Changing the TIM might help the peaks, but it's still a noisy cooler design overall, so I'd rather use aftermarket cooling parts or at least have the card come with its own custom cooler setup. I could easily sleep next to my current card at 100% if I ever needed to, whereas that never would've happened with the stock vacuum-cleaner blower at 100%.
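(To put the "sudden jumps" point in concrete terms: stock curves often step abruptly between duty levels, while a custom curve can just interpolate between temperature/duty points. A rough sketch - the points below are invented for illustration, not AMD's actual stock values:)

code:

# Smooth custom fan curve: linear interpolation between (temp C, duty %)
# points. Invented points, for illustration only.
curve = [(40, 20), (60, 35), (75, 50), (90, 80), (95, 100)]

def fan_duty(temp_c):
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_duty(70))  # 45.0 - ramps gradually instead of jumping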

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
The current bunch of non-budget cards can all drive more than two displays, right? This isn't just a thing with specific models? Technically, I'd just need to run two displays and clone the primary to a third output. I'm wondering about this because I'd like to drive that Oculus Rift without plugging things in and out all the time, and I already have two displays.

That said, is there a performance impact to a cloned display? Do graphics cards clone the electrical signal, or do they maintain separate framebuffers that data has to be copied to?

Shaocaholica
Oct 29, 2002

Fig. 5E
A cloned framebuffer isn't all that much VRAM, so I don't think it would matter. In theory it could be two pointers to the same data, but then you couldn't overlay anything different on each one, and I'm not sure that's functionality that needs to be kept.
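(Back-of-the-envelope numbers, assuming 32-bit color and double buffering - an assumed format, not a figure from any specific driver:)

code:

# Rough VRAM cost of one extra cloned framebuffer at 2560x1440.
width, height = 2560, 1440
bytes_per_pixel = 4   # RGBA8, assumed
buffers = 2           # front + back
size_mib = width * height * bytes_per_pixel * buffers / 1024**2
print(f"{size_mib:.1f} MiB")  # ~28.1 MiB - negligible on a 3 GB card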

veedubfreak
Apr 2, 2005

by Smythe

cisco privilege posted:

They've said that every generation since the 4000-series blowers were released. Replacing the TIM may keep it from ramping up as much, but they're still (subjectively) very noticeable at 40-50% and incredibly noisy at 100%. The sudden jumps in stock fan curves usually don't help much either. Changing the TIM might help the peaks, but it's still a noisy cooler design overall, so I'd rather use aftermarket cooling parts or at least have the card come with its own custom cooler setup. I could easily sleep next to my current card at 100% if I ever needed to, whereas that never would've happened with the stock vacuum-cleaner blower at 100%.

Oh trust me, I know how loud the 290s are. I ran my 290s for a day or two before my waterblocks showed up. There's a reason I bought the biggest case I could find; the only noise my computer makes is the burble of the water going back into the res when it needs to be filled back up :)


Ignoarints
Nov 26, 2010
Jeez, I just realized the Corsair HG10 doesn't come with a fan at all, despite all the references I read to it using the stock reference blower. Back to NZXT I guess, and gluing little heatsinks on my stuff.
