EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Sir Unimaginative posted:

At least there are probably things AMD can still do to improve their GPUs; nVidia may have had their Sandy Bridge moment but ATI hasn't had a post-PhenomII stall yet.

How can you come to this conclusion? Legitimate question, I'm not seeing how nvidia can't get more out of their GPUs, and it honestly looks like the Radeons are heading for "super" Fermi territory. TBH, I am ignorant of the details, which is why I ask.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

At the very least, we know that AMD is about to leave nVidia in the dust with regards to memory bandwidth. The next high-end cards that AMD releases will feature Hynix's stacked memory modules. You may remember several years back when nVidia was touting their own stacked memory products, which were then delayed to Pascal. This is because their version of non-planar memory, which they were working on with Micron under the name Hybrid Memory Cube (HMC), fell through and the project was dropped. They will now shoot to use the joint AMD-Hynix High Bandwidth Memory (HBM) instead. The first generation of HBM is touted to, on paper, permit a memory interface 1024 bits wide per stack. The second generation of the standard, which they are working on presently, seeks to double the bandwidth as well as the capacity per stack.

For comparison's sake, a GTX 980 has only a 256-bit memory bus. A Titan Black only has a 384-bit memory bus. (The Titan Z also only has a 768-bit memory bus, but that's a dual-GPU card, so still, 384-bit per-GPU.)

All the memory efficiency and color compression touted by Maxwell is about to be obsoleted by this advancement.

We should not expect 1:1 performance improvements due to the increase in bandwidth, by which I mean that the full-fat Fiji XT should not suddenly triple the performance of a Titan Black. But it gives AMD a serious leg up, as nVidia will not bring non-planar memory products to the market for at least one year.

I thought only the R9 390s were to use HBM and everything else was using standard GDDR5, which would explain the rumored TDP of the 380(X)? I mean, maybe they capture the enthusiast market for a year and a half, but Nvidia can just reclaim it with the 1000 series, correct? Or are they at such a legitimate technological disadvantage that Nvidia could roll out first-gen HBM on the 1000s just as AMD drops second-gen cards with better thermals/consumption?
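The bus-width numbers quoted above can be sanity-checked with napkin math: peak bandwidth is bus width divided by 8, times the effective data rate. A rough sketch; the clock figures below are the commonly quoted specs of the era, so treat the results as ballpark only.

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gtps

# GTX 980: 256-bit GDDR5 at an effective 7 GT/s
gtx_980 = peak_bandwidth_gbs(256, 7.0)           # 224 GB/s
# Titan Black: 384-bit GDDR5 at an effective 7 GT/s
titan_black = peak_bandwidth_gbs(384, 7.0)       # 336 GB/s
# First-gen HBM: four 1024-bit stacks at an effective 1 GT/s
hbm1_4stack = peak_bandwidth_gbs(4 * 1024, 1.0)  # 512 GB/s
```

So the wider-but-slower HBM interface still comes out well ahead of GDDR5 on paper, which is the whole pitch.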

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I guess this is more a GPU question than a monitor question, but as far as picking a new monitor goes, does having a DisplayPort matter in the context of the next generation of GPUs with HBM? I'm guessing that even modern or near-future GPUs won't bottleneck on DVI or HDMI yet, but I wanted to be sure.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Subjunctive posted:

I'm sure they'll learn their lesson when this fracas causes people to flock away from their cards into the market-snuggling embrace of AMD.

Can't tell if this is sarcastic or not. Despite SAD's hope that HBM provides AMD with some temporary advantage, there is a reason there is a "deathwatch" thread.

I'd almost toxx on the 300 series cards flubbing and Zen being the guillotine.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Nvidia learning a lesson, anyone buying AMD cards, or just "Yes"?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

Well, stacked memory was *supposed* to be Pascal's thing. But since HMC went nowhere and they're having to use AMD-Hynix's HBM, I think parity will last a little longer than that.

Dude, AMD is [soon to be] dead, just let it go.

AMD basically wraps it up, 2017. I'll toxx on this if you want to set the rules because I'll gladly eat a ban to be wrong, but I'm pretty sure I won't.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

Well, my gut feeling is that AMD as a whole company is on the turnaround. I feel that their CPU products have been waiting on technology to catch up with their ideas, and that their GPU products are fine. Not exceptional, but fine. But really, whether or not you eat a ban is entirely up to you. Don't let me stand in the way of your committing sudoku.

Really though, my investment in seeing AMD survive is purely from the "everyone wins when there's competition" angle. Don't tell me that NVidia would keep pushing their R&D to make better tech to get ahead of everyone else if AMD suddenly ceased to exist. That's not how for-profit companies work. I'll buy either company's cards, depending on which provides me the best price/performance ratio. To date, there have been no vendor-specific features that I have absolutely had to have. In flipping back and forth between ATI/AMD and Nvidia every new card (that's just how it winds up), my only preference has been that AMD handles switching and activating/deactivating multiple monitors better than Nvidia, because they put all those controls right there in the system tray icon.

But really, you should be rooting for AMD too, because otherwise we'll just have another Intel lazily incrementing their process tech instead of looking for that next big technological advancement.

veedubfreak posted:

I look forward to the day that AMD goes under and Nvidia starts selling GPUs for no less than 500 bucks. Let's all root for the failure of AMD, monopolies always work out alright for the consumer.

You guys are behaving like that monopoly hasn't de facto existed for a while, and like AMD hasn't just been propped up like a corpse by miners and fools. The 300 series will flop, Zen won't match Skylake, and the 400 series won't compete with the more capable Geforce 1000 cards. AMD will die, no one will notice, and things will continue on as before, because demand from high-end users will keep pressure on Intel and Nvidia to innovate at the same or better price point. The death of AMD is not the techpocalypse.

AMD is already VIA; they are already noncompetitive. I could root for AMD until I turn loving blue, but that means nothing as far as them being a good company which can remain competitive.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Rastor posted:

It would just be nice if AMD would get with the process stepping, aren't all their CPUs and GPUs on 28nm processes?

I don't see the point of a Steamroller AM3+ 28nm CPU, though?

Kazinsal posted:

Their CPUs are worse. The FX series is still on 32nm (:aaa:) and their APUs won't be 28nm until Q2 this year (:aaaaa:)

Kabini/Kaveri is 28nm???

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Here's a question: exactly how much would, say, an R9 370X have to outperform a GTX 970 for most people to consider it, to the point where power consumption and heat become irrelevant next to performance? IIRC, the leaked 380X benches indicate better performance than the GTX 980, so it looks like the 370X will be competing with the 970.

Does it have to be equal? Or could they manage a 2:3 ratio between performance and consumption and have it still be worth it?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

If power consumption, heat, and noise are pointless factors compared to performance, then whichever one delivers a better performance/price ratio once shipping and taxes are included is the one I'll buy.

Not pointless factors, just asking about the case where performance is the overriding factor for a given price/TDP.

Is the heat worth it if it has 0.95:1 performance relative to the 970 and costs $250? Maybe I am phrasing the question incorrectly: how much is a low TDP or low power consumption actually worth? It doesn't seem entirely subjective. I remember the Fermi poo poo storm, and this seems to be the same situation but reversed, yet people still bought Fermi cards even when the HD 5000 series was competitive in performance and the noise/TDP were definitely in AMD's favor.
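One way to put a number on "how much is low power consumption worth" is the electricity the hotter card burns over its life. A minimal sketch; the usage hours and the $/kWh rate below are made-up assumptions, not measured figures:

```python
def extra_power_cost(extra_watts, hours_per_day, dollars_per_kwh, years):
    """Lifetime electricity cost of a card drawing extra_watts more under load."""
    kwh = extra_watts / 1000 * hours_per_day * 365 * years
    return kwh * dollars_per_kwh

# Hypothetical: 100 W more under load, 3 h of gaming a day, $0.12/kWh, 3 years
print(extra_power_cost(100, 3, 0.12, 3))  # about $39 over three years
```

On numbers like these, a modest discount on the sticker price covers the TDP gap, which is roughly how Fermi kept selling.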

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Based on that graph, AMD is already dead. It's not like they're going to recover with the 300 series, so basically the entire market will get captured by Nvidia, especially since any fabled 400 series would have to pull a complete rabbit out of a hat bullshit to be competitive. They slipped into a coma early 2010 and it looks like life support is getting cut late 2015.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

HalloKitty posted:

Aren't you just willing that to happen, though? Because you toxxed yourself based on it. There's nothing massively wrong with AMD GPUs. The 290 and 290X are somewhat power hungry, but the main reason for their decline is people constantly saying things like "oh, NVIDIA drivers are the best", and only recommending NVIDIA.

AMD CPUs on the other hand are a desperate cause.

I didn't do a full toxx because Swiss Army Druid didn't set the parameters.

I wouldn't say Nvidia drivers are the best; I've had problems with them being just as finicky as AMD's, and the only card to ever die on me was an 8400GS in a media center. Otherwise, Nvidia or AMD, the cards have just worked. The issue is performance per watt, and AMD simply cannot keep up. I know we want to keep talking about how AMD will survive like some low-price cockroach, but there comes a point where the added expense of trying to support an AMD card outweighs any benefit to both the OEM and AMD.

The 300 series isn't promising anything good, and since that's an easier and quicker toxx to resolve, for now: :toxx: I'll buy an avatar/username combo of the mods' picking if the 300 series beats the 200 series in performance per watt while having a better price/performance ratio than comparable Nvidia cards. This isn't an impossible standard to set, correct?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Subjunctive posted:

Comparing worst 300 to best 200? Mean of the whole lines?

I'm thinking of comparing an R7 350 to an R7 250, an R7 360 to an R7 260, etc. for the performance/watt improvement, and equivalent market-segment GPUs between Nvidia and AMD, so 390/980, 380/970, 370/960, etc., for whether or not the buy is worth it. Maybe factor in the added expense of increased power usage from requiring beefier PSUs?

If this is a good metric to go by then sure, but what I'm really trying to toxx on is whether AMD is going to make a competitive product without destroying their margins to achieve it. Maybe I am being too specific and should just go with reviews, but eh, it'll only be a $20 idiot tax on my Radeon purchase.
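The pairing scheme above could be scored mechanically. A sketch of the proposed metric; the benchmark scores and TDPs below are placeholder values for illustration, not real numbers:

```python
def perf_per_watt(score, tdp_watts):
    """Benchmark score normalized by board power."""
    return score / tdp_watts

def beats_predecessor(new_card, old_card):
    """True if the newer card improves on perf/watt over its predecessor."""
    return perf_per_watt(*new_card) > perf_per_watt(*old_card)

# (benchmark score, TDP in watts) -- hypothetical values only
r7_360 = (1100, 100)
r7_260 = (1000, 110)
print(beats_predecessor(r7_360, r7_260))  # True: 11.0 vs ~9.1
```

Run the same comparison down the whole stack and the toxx resolves itself without arguing over any single review.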

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I know it's kind of ironic speculation that the 390 will have fast and slow RAM, but wouldn't it be more correct to say fast and faster RAM? Would we even notice a bottleneck if it needs to switch over to GDDR5?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Could they actually get away with more than 4GB of HBM though?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Can't one make a CLC that's fairly compact however?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

And at no point in my post did I say AMD, Nvidia, Matrox, S3 or Intel. As a matter of fact, I specifically constructed that post to be as neutral as possible!

Your biases are showing~ =P

Could Samsung please buy VIA out? VIA is poo poo, and it's kinda sad to see them as a kind of graveyard for tech companies that once mattered. I wouldn't mind seeing a third competitor in the GPU market again. Or did most of the guys who made S3 basically bail to other companies, with VIA just holding onto a name?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I guess Gigabyte's gimmick is terrible design decisions.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

cisco privilege posted:

All that trouble to put tubes on the board and then they pair it with a VIA chipset.

I'll repeat myself, but someone really needs to buy VIA so all the poo poo they bought doesn't waste away forever.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

Oh, I'm sure somebody will! Those patents are all probably worth *something*. Even if they just pull a Motorola just to get at the patents and then resell them a year or two later.

If there is any institutional knowledge left in VIA, they'd be worth more than just their patents. Whoever buys them might be able to throw themselves into the game between Intel/Nvidia/AMD, and I wouldn't mind some kind of "safety lever" brand should someone go completely tits up.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Could AMD do a "refresh" of the 300 series by shrinking the die down to 20nm in Q4 2015/Q1 2016 and then release the 400 series in Q3/Q4 2016?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

400 series is called "Arctic Islands". Hopefully this is a hint that the architecture will be worked on to pull a Maxwell on GCN. If there is no shrink between now and the 400 series, it is a foregone conclusion that the 400-series WILL be on a smaller process node.

I'm uncertain if we'll see any refresh products, either. AMD appears to be refreshing as part of their new series launches. Their strategy now appears to be:

-90/X: New silicon.
-80/X: Last year's -90/X silicon, refreshed and rebadged.
-70/X: New silicon.
-60/X: Last Year's -70/X silicon, refreshed and rebadged.

We only saw one mid-cycle refresh part this past year, relative to 3 in the 7000-series.

I was thinking more along the lines of the R9 285 type refresh, maybe just implementing Arctic Islands features into 20nm Pirate Islands silicon, potentially to generate hype with like two new cards, or at least to have some kind of offering if Nvidia pushes Pascal too fast.

Desuwa posted:

From what I understand they're even hotter and more power hungry, at least the one (the 390x?) where their reference design includes a small CLC. Unless that wasn't real, it is pretty hard to swallow.

Honestly, I've been thinking about that, and isn't HBM supposed to offer significant advantages in energy usage? Wouldn't this create a fairly large discrepancy between a 380 and a 390, where either the 390 is monstrously more powerful or the 380 manages to be more power hungry (or both, even)? That would be odd, and if true would kinda kill sales for the 380s.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

calusari posted:

hopefully the clc is just reference and AMD's partners can still make custom air cooled designs

AMD is solving their lovely reference designs with a hammer, I see.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

Wait, so they went Titan, Titan Black, Titan Z, and now it's Titan X? Jesus, guys, get your loving naming convention together.

They did; you see, each stage of the naming process is more EXTREME for your REAL GAMER needs. I'm waiting on Titan Alpha, Titan Omega and Titan Tits and Explosions. It's the literary form of putting a gun-shaped heatsink on your motherboard.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Maybe it was posted in this thread and I am too dense to find it, but is there any hint about any of the mid-range (380, 370) cards' performance? I want to hazard a guess and say the 370X and GTX 960 are comparable based on the theorized pricing, so the 380 looks to compete with the 970?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Hace posted:

Rumors say that you can expect some more rebadges

The 370s should be new silicon apparently, so I am most interested in their performance, since I am looking for a replacement for the GTX 760 I currently have and am waiting for AMD to reveal their cards, literally and metaphorically.

veedubfreak posted:

High Bandwidth High Bandwidth Memory Video Memory?

Interesting to note "Up to 8GB", so a 4GB card at a lower price point is possible, maybe even without a waterblock, as the listing doesn't seem to make the block definitive, but rather a special edition. It's likely the difference between a 390X and a 390, though.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Zigmidge posted:

Oh. Well. That was easy, I guess. Thanks guys.


I'd try the AMD but I am so not prepared to jump back in that boat after the last time I owned something that needed their drivers.

I personally never had an issue with AMD drivers, even with something like getting an old X1950XTX to work in Win8. I always felt it came down to build quality; I've stuck personally to XFX, ASUS, EVGA, MSI and Sapphire, and the only time I've had something poo poo itself was a cheap rear end PNY or BFG card.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

I had to look at that price chart again, and you're right. Does this mean that the 390 comes with some HBM? Like, in a 4 GB HBM + 4 GB GDDR5 configuration? Interesting...

No, not based on that spec image posted just upthread; it seems AMD is saying up to 8GB of HBM on a 390 card. I'm sure there might be some HBM+GDDR5 combination, but it appears AMD has been able to hit 8GB with 1st-gen HBM.

It's entirely possible that if everything is new silicon, then all chips might have varying amounts of HBM on them (pipe-dream end of the scale), or at the very least the 390s have HBM. I am very eager for June; it's either going to be a disappointment or I'll be spending $20 extra on an AMD card.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

veedubfreak posted:

Announced in June.
Available in July.
Enough stock to find one in August.

If the 290x launch taught me anything.

If they flop, no one will buy them; if they're a hit, no one can buy them. Goddammit, AMD.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
The 300W TDP is still just a rumor, correct?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

beejay posted:

Additionally, FreeSync can handle down to 9Hz/fps, but the monitors that are currently out only go down as far as 40Hz.

Is this confirmed somewhere? Because if so, then maturity on FreeSync makes Gsync pointless: no one is going to be playing games or watching things at 9 frames per second (no one sane).
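For a sense of what those Hz floors mean in practice, a variable-refresh range maps directly to a frame-time window the panel can sync to without tearing:

```python
def frame_time_window_ms(min_hz, max_hz):
    """(longest, shortest) frame time the panel can sync to, in milliseconds."""
    return 1000 / min_hz, 1000 / max_hz

# The spec's claimed 9 Hz floor vs. the 40 Hz floor of shipping panels,
# both on a hypothetical 144 Hz panel
print(frame_time_window_ms(9, 144))   # about (111.1, 6.9) ms
print(frame_time_window_ms(40, 144))  # about (25.0, 6.9) ms
```

Below the floor, the monitor has to fall back to repeating frames or fixed refresh, which is why a 40Hz floor still leaves an opening for Gsync's low-framerate handling.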

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Even if 9-240Hz is misleading, the ranges provided for a FreeSync monitor still make Gsync pointless, but only if the required monitor quality doesn't erase the cost savings. I'm just not seeing Gsync's advantage personally :shrug:

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
If desktop Skylake is Q1 2016, that's a really awkward time to release, considering 2016 is when AMD is dropping K12 and Zen, both of which will have HBM2, while Skylake, AFAIK, does not, and I'm not sure how soon Intel would be willing to just drop another iteration of the PIII-derived Core with HBM to compete with that. If the Zen uarch is competitive with Core and comes with a gigantic 4-8GB L3 cache, that's a lot of market capture.

Err, this post really should be in the AMD thread :v:

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

BIG HEADLINE posted:

The only thing I'm really waiting for is Pascal and NVLink. Other than that, I'm also happy with my 2500K @ 4.4Ghz for ~3 years now.

NVLink being a thing is, er, bad? That's either the death of AMD or a bunch of dumb proprietary interfaces flying around, because like hell AMD will use NVLink. Nvidia is hoping NVLink is good because that's them forcing AMD out of the market, like Gsync.


I'll never get why AMD dropped support for the 4000 series and older cards. "We've optimized the best we can" is okay and all, but how about updating them so they work in new Windows environments? It really shouldn't be a pain in the rear end to get an X850XT or X1900XTX to work in a Win 8 environment when old Geforce bullshit works. What the hell, AMD.

Likely a dumb question, but what are the baseline requirements for Vulkan to work, any idea? It doesn't seem GCN- or DX11/12-bound, so... would it work with a ye olde GTX 280 or HD 4870? A Geforce FX 5950?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

GrizzlyCow posted:

Vulkan is compatible with any GPU that supports OpenGL ES 3.1 or higher. I think OpenGL 4.3 is a requirement too, but I'm not too sure.

NVLink will have a hard time taking off if NVIDIA doesn't sell Intel on the idea first. Unless they're literally planning to go over Intel's head with this interface. Also, is there any reason other than licensing that AMD couldn't jump onboard the NVLink train?

I must be dumb; I'm looking at AMD card specs and I'm not finding the OpenGL ES support listed, and sometimes not even OpenGL. The only thing I've found hints that the 5000 series would be the bottom tier for AMD.

And Nvidia wouldn't stop and take a moment to realize they could essentially kill a competitor with a new standard? Refuse to license it while holding a commanding lead in market share, or make the license so costly that AMD can't be price competitive, etc.

BIG HEADLINE posted:

I think the 'fear' being bandied about regarding NVLink is unfounded. There's no way it's not going to be a glorified 'bolt-on' technology that will probably add $50-75 more to the cost of a motherboard that only has PCIe 3.0/4.0 on it, which means nVidia's going to be forced to make the first-gen Pascal GPUs for both NVLink *and* PCIe, so it'll be a non-issue at first.

So Nvidia then forever holds the GPU crown due to a proprietary interface, and AMD weeps for their silicon? NVLink promises to make PCIe another AGP.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Maybe I am being hyperbolic, but when Nvidia controls almost 80% of GPU market share and it's only looking to get better for them, at what point does the decision to choose become rhetorical? You're saying Nvidia would make AMD an attractive mid-tier enthusiast option by essentially loving themselves with NVLink somehow, when really most people are going to choose Nvidia to begin with; if you're already buying a high-end Nvidia card, why not buy a board with NVLink and "futureproof"?

Maybe the sky isn't falling, or won't fall, but I'm not seeing how Nvidia won't try to use NVLink to seriously hurt AMD before Intel sets a new board standard. They've got AMD trapped in a corner; time to bring the hammer down and finish the job.

I've also toxxed on AMD being a gigantic gently caress-up for 2016, so maybe I have ulterior motives.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Rastor posted:

This is sort of interesting: ASUS is releasing a GTX 970 with a blower cooler and calling it a "GTX 970 Turbo":



http://www.techpowerup.com/211252/asus-announces-geforce-gtx-970-turbo-graphics-card.html

This card is apparently made for dweebs like me, who like their rigs to be composed of the same thematic colors and LEDs and get upset when it's a mash of clashing elements. I can see this going into someone's red-white-black, white-LED case.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

...you mean you don't paint your own components?

I am poor and clumsy; it's less expensive to be picky than to gently caress up a component with no warranty.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
That might actually be a real 390X; wasn't the rumor mill saying no single-chip flagship? It appears to have two separate heatsinks, so maybe the TDP is under enough control that watercooling is purely for the reference card, while non-reference air-cooled designs are "easy" enough to do. That's better than what the 295s got away with, and this is supposed to stomp the poo poo out of the 295. Also, the TDP seems to be a significant improvement, from the 295's 500W to the 390X's 300W.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

Or it might not. I'm seeing stuff say that it's actually a 380X, with blame placed on the potato-quality photo.

Yeah, the photo quality here is not helping, but the font used is what's really throwing everything off. Any worse quality and 6, 8, and 9 would be indistinguishable.
