Truga
May 4, 2014
Lipstick Apathy
Has this been posted here yet?

https://www.youtube.com/watch?v=spZJrsssPA0

It's pretty good IMO.

Truga
May 4, 2014
Lipstick Apathy
With the decent custom coolers vendors are now installing on cards, I thought noise was a non-factor for both nvidia and ati unless you live on the sun?

Power usage might be a factor if you live in some backwater country like Germany or the Netherlands, where the ~green power~ policy has caused electricity prices to skyrocket to really stupid levels. Or if you do something that loads your PC 24/7, because a couple of hours of 200W gaming per week isn't going to be noticeable.
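(Rough back-of-the-envelope below; the kWh price and the weekly hours are made-up illustrative numbers, not anyone's actual bill.)

code:

# Back-of-the-envelope: a few hours of 200W gaming per week vs. the same
# load running 24/7. All numbers below are assumptions for illustration.
extra_draw_w = 200        # assumed extra draw while gaming, in watts
hours_per_week = 5        # assumed gaming hours per week
price_per_kwh = 0.30      # assumed "expensive European" rate, EUR per kWh

kwh_per_week = extra_draw_w / 1000 * hours_per_week
cost_per_month = kwh_per_week * 4.33 * price_per_kwh      # ~4.33 weeks/month
cost_247 = extra_draw_w / 1000 * 24 * 30 * price_per_kwh  # same draw, 24/7

print(f"Gaming: {kwh_per_week:.1f} kWh/week -> ~{cost_per_month:.2f} EUR/month")
print(f"24/7:   ~{cost_247:.0f} EUR/month")
# -> Gaming: 1.0 kWh/week -> ~1.30 EUR/month (noise-level)
# -> 24/7:   ~43 EUR/month (starts to matter)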

e: Or you're a datacenter with heat issues.

Truga
May 4, 2014
Lipstick Apathy

Darkpriest667 posted:

I swear to god the privacy paranoia is getting to tinfoil levels.

Is it paranoia if I'm right?

Truga
May 4, 2014
Lipstick Apathy
Yeah, I was gonna get two 970s to run my 30", and now I don't think I will over this. 3.5GB might be just right for 1440p, but people with 2560x1600 say it can bog down in some games. I think I'll just wait for the R300 and see what happens to 980 prices, or if an R300 is a better deal.

Truga
May 4, 2014
Lipstick Apathy
I'm looking out for something that will deliver ~100fps in most games so I can just buy a consumer Rift and plug it in when it arrives. A single 980 doesn't quite cut it, but two 970s do IIRC. I'm definitely going to wait a bit longer now though, just to see what happens. The last couple of months I haven't been playing anything graphically heavy enough to warrant an upgrade anyway.

Truga
May 4, 2014
Lipstick Apathy
http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html

quote:

Exclusive: DirectX 12 Will Allow Multi-GPU Between GeForce And Radeon

I just saw this. Is this legit? I want to think it is because it's Tom's Hardware, but...

Truga
May 4, 2014
Lipstick Apathy

HalloKitty posted:

Not if NVIDIA has anything to do with it; they're downright petty when it comes to that, such as not allowing an NVIDIA card to run as a PhysX card if an AMD card is installed.

I was just about to google this. They're still doing that, then? I guess I'll keep buying ATi; so far I've never had the issues with their cards/drivers that people keep whining about, and they seem to not be giant shitlords.

E: Reading further into that article, apparently they want to treat multiple GPUs as one single GPU with a lot of power to render every frame. That sounds all kinds of cool and awesome, but will there be a non-poo poo engine that does it in the next 10 years?

Party Plane Jones posted:

Wasn't there some bullshit thing with AA in specific games (like Batman) being forced off if Ati cards were detected?

Google says yes. The way it's meant to be played, surely.

Truga fucked around with this message at 13:30 on Feb 25, 2015

Truga
May 4, 2014
Lipstick Apathy
I honestly don't have issues with ATi/AMD GPUs though? The first and last time I did was when I got my first ever job and treated myself to a brand spanking new 4870X2 fresh out of the factory, and quickly found out Gothic 3 didn't work well with it: some trees glitching and AA not working (all of which got patched in the first Catalyst update ~2 weeks later, only for me to find out it's actually a bad game). Before and since then, though, basically no issues.

I'm not going to say anything about their CPUs because they're poo poo tier right now, and have been for a while, but GPUs seem decent to me? Am I just super lucky and playing exactly the right games or something?

Truga
May 4, 2014
Lipstick Apathy
From what I gather it'll be like the R9 2xx release. The R9 390 and 390X are a new thing, the 380/380X are the previous 290/290X, etc.

But yeah, I hope they can get down to less than 28nm soon, this is getting silly. And the market share thing is definitely not good.

Truga
May 4, 2014
Lipstick Apathy

HalloKitty posted:

It doesn't matter really, as long as the perf/watt is competitive. If you had a 300W card with perf/watt like Maxwell, that's a loving beast.

If that happens, I'd actually stop putting off my upgrade and just do it.

Unfortunately, it won't happen :(

Truga
May 4, 2014
Lipstick Apathy
You can do that if you have the slots.

Truga
May 4, 2014
Lipstick Apathy
Dual GPU cards have come a long way in the last couple years.

Truga
May 4, 2014
Lipstick Apathy
People are saying 380 is a rebranded 290. Probably slightly higher clocks, so yeah around the 970 mark.

Truga
May 4, 2014
Lipstick Apathy
I read somewhere that radeons use more power themselves, but put less load on the rest of the system, so the difference in total power consumption ends up quite a bit smaller than the difference between the nvidia and radeon cards alone.

Whether that's true or not I have no idea, and I can't remember where I read it. I'll try to find the source.

E: Can't find it now so it might just be something I read on a forum. Probably bogus.

Truga fucked around with this message at 13:18 on Mar 18, 2015

Truga
May 4, 2014
Lipstick Apathy
I love EK because everything I order there arrives the next day, since their shop is in my town :3:

Truga
May 4, 2014
Lipstick Apathy

KakerMix posted:

Games that aren't built to take advantage of SLI will not use the second card, yes. Many times the game will perform worse with SLI mode on than with it off if it doesn't have SLI support.
You can force games to use SLI using other games' profiles, yes; sometimes it can work great, but I've never had any luck with the games I play.

And people tell me ati has bad drivers :laffo:

beejay posted:

I guess my point is, with adaptive sync technologies, falling below ~40fps will basically be the same situation as 60Hz monitors now when below 60fps?

Yes.

Truga fucked around with this message at 16:58 on Mar 20, 2015

Truga
May 4, 2014
Lipstick Apathy

KakerMix posted:

It's the same thing with ATI cards and multi gpu rendering :ssh:

:allears:

The only game I play where crossfire actually drops my performance is DCS, which is a 15 year old soviet pile of poo poo engine (and it runs fine on a single card anyway). In a sizeable chunk of the games I play it outright won't work, but anything it doesn't work in is also either old or features minecraft graphics, and thus runs okay. In the rest, it gives me an FPS boost that simply wasn't available in a single GPU setup when I bought.

Is this just another case where I've had insane luck with it for the last couple years, while everyone else with ATI cards has their computer blow up as soon as it's plugged in or the drivers are installed?

I mean, don't get me wrong, more than a couple years ago the crossfire experience was absolutely poo poo in almost anything (same with SLI, really), but these days it just works for me everywhere I actually need it. I haven't had any experience with SLI lately, but I also very much doubt it's as poo poo as you make it out to be?

Truga
May 4, 2014
Lipstick Apathy
I don't even get why this is an issue though. Cards are already doing what's being described here, aren't they? It's not like your monitor suddenly goes blank or flickers if your framerate drops to 3 for whatever reason?

Truga
May 4, 2014
Lipstick Apathy

sincx posted:

Any updated rumors on the R9 380/390(X)? I think people were talking late Q1 or early Q2 a few months ago, and that's obviously gone out of the window.

Delayed to summer a month or so ago.

Truga
May 4, 2014
Lipstick Apathy
http://www.guru3d.com/news-story/meet-the-xfx-radeon-r9-390-double-dissipation.html

So this floated up. Is it just me or does that thing look giant?

Truga
May 4, 2014
Lipstick Apathy
That's the thing, check the 2nd pic. It's a dual slot card.

Truga
May 4, 2014
Lipstick Apathy
:laugh:

Truga
May 4, 2014
Lipstick Apathy
Yeah, don't buy the 295x2, just get 2x290. You'll get less than 10% lower stock performance and way more overclock headroom, for nearly 20% less cash. Unless you absolutely require single slot 4k, in which case the 295x2 trashes everything else, but...
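(For what it's worth, here's that math sketched out; the prices and the ~8% performance gap are assumptions picked to illustrate the claim, not actual quotes.)

code:

# Illustrative only: assumed street prices and an assumed ~8% stock
# performance gap, to show how the percentages above work out.
price_290 = 280.0     # assumed EUR per R9 290
price_295x2 = 700.0   # assumed EUR for an R9 295X2
rel_perf = 0.92       # assumed: 2x290 stock perf relative to a 295X2

cost_2x290 = 2 * price_290
savings = 1 - cost_2x290 / price_295x2
print(f"2x290: {cost_2x290:.0f} EUR, {savings:.0%} cheaper, "
      f"~{1 - rel_perf:.0%} lower stock performance")
# -> 2x290: 560 EUR, 20% cheaper, ~8% lower stock performance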

Truga
May 4, 2014
Lipstick Apathy
:cripes: 8k benchmarks.

e: Yes, radeons have always been better at high resolutions, but nobody has 8k monitors right now, you idiot.

Truga
May 4, 2014
Lipstick Apathy
At 4k the 970's performance seems to drop off, even below the 290 at times. 3.5GB? :v:

E: Also, I see SLI is pretty much the same as crossfire. Awesome fps boost at high resolution, minor at lower.

Truga fucked around with this message at 09:22 on Apr 18, 2015

Truga
May 4, 2014
Lipstick Apathy

Paul MaudDib posted:

295x2 isn't single slot, it's double slot IIRC.

Sorry, I meant single pcie slot. Should have specified.

Truga
May 4, 2014
Lipstick Apathy
A couple months ago I'd still have said go with a 970 for 1440p.

With GTA 5 hitting 3GB of vram used at 1080p and the 970 effectively being a 3.5GB card, I'm not so sure any longer. :v:

Truga
May 4, 2014
Lipstick Apathy

Gwaihir posted:

With all the options cranked on GTA5 (Other than AA) it still only uses 3.5 gigs of vram on my 2560 * 1600 monitor, measured from afterburner rather than the ingame estimator.

That's exactly my point. What happens when Witcher 3 (or some other high profile game) releases and uses 3.8GB?

I thankfully don't need to upgrade yet, so I can just wait and see on this issue (and hopefully catch a price drop when the 980 Ti and 390 release).

Truga
May 4, 2014
Lipstick Apathy
https://www.youtube.com/watch?v=BdwUsalwBJ8

:ssh:

Truga
May 4, 2014
Lipstick Apathy
Unfortunately, 4k seems to be 30Hz only :<

Truga
May 4, 2014
Lipstick Apathy

FaustianQ posted:

DX12 might in theory improve performance on older multicore machines, yea, nay?

It will in theory.

In reality, the developers will just ramp up the draw calls instead and your old multicore machine will still run the game like poo poo. And no older games are ever getting ported to dx12, because that'd take :effort:

Truga
May 4, 2014
Lipstick Apathy

veedubfreak posted:

Oh, just a quick note, about multi card. I have been playing MW:O for about 2.5 years at this point and have been through a 690, xfire 7970, xfire 290x, sli 970, sli 980 and now my Titan X. The Titan X finally gives me a solid 50-60 (limited to 59 in nvidia) fps now in that game at very high (max) settings at 7680x1440. The SLI 980s couldn't even manage it.

Bold mine. I think there's your problem. MWO is a piece of poo poo software. If you check their forums, tons of people there have problems with it, people without sli too. And crossfire/sli is not even officially supported.

On the other hand, lately I've been playing gw2/ff14 mostly, and the difference between crossfire off/on in ff14 is 25-35 and 50-60 fps, and in gw2 I go from ~35 minimum to an almost constant 60. And this is on two 6950s. That's over 4 years old now? I'm sure newer stuff works even better.

IMO, it's very silly to base your experience on one game and then generalize, because it'll never paint the right picture. Instead, just check the numbers on the internets for the games you play and decide based on that. If I was playing mostly MWO I'd probably ditch crossfire too. But not everyone does.

It sucks that sli/crossfire isn't supported in your game, but that's just one game out of many. I get craptons of mileage out of my CF setup, and 99% of the time games that don't support it don't need it either. And if they do need CF at my resolution and don't support it, they can go gently caress themselves, frankly.

Truga fucked around with this message at 10:12 on Apr 23, 2015

Truga
May 4, 2014
Lipstick Apathy
Usually prices drop as the ~next gen~ hits from whichever side.

Truga
May 4, 2014
Lipstick Apathy
Well, nvidia's next gen was the 900 series, so unless AMD's 300 series is better price/performance, not much will happen to prices. If it's better, nvidia will drop prices.

Same goes the other way around. If nvidia drops something that's better than AMD's current cards, AMD will drop prices to compete.

Truga
May 4, 2014
Lipstick Apathy

veedubfreak posted:

To add to that, didn't the 390x get pushed to summer at this point?

Yep, reveal in early June, which probably means July/August for buying?

Truga
May 4, 2014
Lipstick Apathy
I think nvidia's gonna do new numbers or skip to the 1100s, because having an nvidia 1080 when 1440p and 4k are all the rage would make their sales dept flip their poo poo.

Truga
May 4, 2014
Lipstick Apathy
Dunno if this has been posted yet but....

http://www.kitguru.net/components/graphic-cards/anton-shilov/supply-of-amd-radeon-r9-390-series-may-be-constrained-report/

Apparently, 390 supply might be low for a while even after release, because of slow HBM production.

Truga
May 4, 2014
Lipstick Apathy
Really makes you wonder what AMD was doing in those years though. Radeons were the best thing to mine bitcoins on for at least 2 years. Some altcoins are still mined best with GPUs, so I'm sure some cretins still buy them for that reason.

But AMD couldn't manage to get more cards out there? Sounds odd.

e: I bought my 6950s to game on, spent 400 euros on them, which seemed like a lot at the time.

Then they mined 100 bitcoins. :v:

Truga
May 4, 2014
Lipstick Apathy
I have 8 coins left, just in case. Cashed in the rest when they first hit 500 or so.

Truga
May 4, 2014
Lipstick Apathy
When I last cared about gpu decoders, it was an entirely separate chip on the card. Probably still is.
