Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Tanreall posted:

At least we'll still have FreeSync/adaptive-sync in this future.

If anyone besides AMD picks it up (which is likely, because it's just a good idea), Nvidia will probably treat it like CUDA vs OpenCL: they'll end up supporting both but heavily promoting G-Sync.

At least that's what I told myself when getting a Freesync monitor.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Mutation posted:

It was a really bad idea to put all their effort into creating a luxury GPU and rebranding their rebrands.

Not having had a new arch for the mainstream, where the sales are, since 2012 surely hurts. Kinda doubt their profit per unit sold is comparable to Nvidia's either.

Price/perf they're not in a bad spot but nobody *really* likes buying old tech in the gaming market.

Also, what was up with letting every site review the R9 390X instead of the 390, which sits at a very competitive price/perf point?

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

BurritoJustice posted:

Not that a single freesync/adaptive-sync implementation has come close to going that low.

Have a player that doubles every frame to output 48Hz and get the panel to sync to that.

I'm 100% serious BTW, that's how G-Sync works. AMD should get their drivers to work the same way.
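
To make the idea concrete, here's a minimal sketch of the multiplier logic, with a hypothetical panel range as input (the 40-144Hz numbers below are illustrative, not from any specific monitor):

```python
# Sketch: find the smallest whole-frame repeat count that lands the output
# rate inside the panel's variable refresh range (the G-Sync-style trick).
def repeat_factor(content_fps: float, vrr_min: float, vrr_max: float) -> int:
    factor = 1
    while content_fps * factor < vrr_min:
        factor += 1
    if content_fps * factor > vrr_max:
        raise ValueError("no integer multiple of the content rate fits the range")
    return factor

# 24fps video on a panel that only syncs between 40Hz and 144Hz:
# every frame is shown twice, so the panel runs at 48Hz.
print(repeat_factor(24, 40, 144))  # -> 2
```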

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.
You can't add up the RAM in CrossFire/SLI though; both boards need their own copy of the textures at hand.

Squeezing more out of the memory wouldn't be impossible, given that AMD's GCN 1.2 color compression is less efficient than Nvidia's. But that likely needs new hardware.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.
Adaptive Sync and FreeSync are the exact same thing on the monitor side, though. FreeSync monitors are just adaptive sync ones taking a ride on AMD marketing.

This was 100% predictable as Adaptive Sync is a totally free standard and Intel can't use G-Sync.

I'm not sure what else Intel could have done. They can't ignore important tech advances when they're pushing their GPUs up towards the middle of the market.

Nvidia will likely treat it like OpenCL: add support but keep promoting the solution that forces vendor lock-in as superior.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

HalloKitty posted:

To be fair, NVIDIA would still dictate the direction for CUDA, and no doubt they could add hardware to their cards, and features to CUDA, that run slower on AMD hardware. You say it sounds like a reasonable offer, but I can completely see why AMD would rather work with an already open standard.

Yeah: a competitor offering you (likely incomplete) documentation of a technology they've already deployed in the field, with a bunch of bugs you'll have to reverse engineer and stay compatible with, combined with a codebase that's impossible to sanely integrate into your own tooling, that's completely unknown to you, and whose most familiar engineers are all on your competitor's team. Then both of those things can evolve in directions you have no say over.

This does not make for an "open standard".

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

FaustianQ posted:

Honestly, I'm not sure how it can be marketed as superior. If FreeSync eventually manages 30-144Hz, why bother with G-Sync? Anything under 30fps feels like a slideshow to me and even below 50fps it irritates my eyes, so I don't see the utility of G-Sync's lower range.

My MG279Q already has 30-90Hz and honestly that's more than enough, and if it works with future Intel stuff I'm happy.

I just had a lengthy discussion about this in the monitor thread. Basically I 100% agree with you that the advantages of G-Sync are of no relevance to the practical world, but some people apparently vehemently disagree and believe they absolutely need ULMB (with its problems) and that the range limits make it terrible.

I suspect Nvidia's marketing is quite effective to those people.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.
Well the point of discussion (in this thread) was whether G-Sync could be advertised as being superior. I believe it's clear we actually agree it could, though for differing reasons.

Nvidia doesn't market it as such *now*. If you read my original post, I was talking about a situation where they support both G-Sync and FreeSync, comparing it to CUDA vs OpenCL.

Hiowf fucked around with this message at 11:45 on Aug 24, 2015

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Gwaihir posted:

That doesn't excuse hopping from the monitor thread to this thread to lob veiled insults at another poster, who obviously also reads and posts in both places.

The *existence* of that "another poster" is the direct answer to a question asked here: namely, how G-Sync could be marketed as superior on a card implementing both technologies. I can disagree on the relative merits of G-Sync vs FreeSync, but I can't disagree that people exist who think G-Sync is superior, and not just in the theoretical sense.

quote:

Christ, could you fanboy any harder? We get it, you think Nvidia is the EVIL EMPIRE of corporate greed and wouldn't do anything at all that benefits a consumer of video cards unless forced to by plucky little AMD and ~THE FREE MARKET~

Seriously, go to the Nvidia site and compare the information about CUDA against the information about OpenCL. You'll have to find OpenCL first, as there are dozens of places that reference CUDA but OpenCL isn't even in the menu or anything. Then realize using CUDA will get you locked in to a single vendor.

Pointing out that one vendor's strategy is to promote lock-in does not make one a fanboy of a random other vendor.

Hiowf fucked around with this message at 16:29 on Aug 24, 2015

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Paul MaudDib posted:

AMD's 970 competitor is the 390x.

Isn't the plain R9 390 closer?

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

FraudulentEconomics posted:

I just bought a 1440p ultrawide monitor and was wondering if it makes more sense to get a 970 or 980 ti. Budget isn't much of an issue, but the extra cost would push back the build date of my machine. I don't need BLEEDING EDGE AWESOME gaming, but I know you pay a premium for power.

The 970 loses in price & perf to the R9 390, so I wouldn't get that. Also look at reviews to see what FPS those cards push at resolutions with similar amounts of pixels. They're really going to struggle, so it makes sense to aim higher.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Twerk from Home posted:

This isn't really true, is it? A stock 390 is slightly slower than a stock 970, but the 970s are overclocking monsters.

You can overclock the 390 too. Whether the 970 gets enough extra overclock to then match it seems to depend on how lucky you get with both cards and which benchmarks you select. The tests for this are all over the place IIRC.

I personally don't overclock anyway.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Truga posted:

I, too, pay >$300 for something and then don't give a poo poo how moving 1 or 2 sliders in windows gives me an extra 20% performance. No sir.

Oh hey an overclocking debate, just what I wanted to get into.

It's only free performance if your time isn't worth anything. Do you overclock your stuff without even testing system stability? Without having preemptively specced the rest of your build higher?

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Truga posted:

And yes, this is by all accounts nvidia's fault. They should have higher clocks out of the box IMHO.

Don't the board partners decide the factory default overclock? My previous Nvidia card shipped with a 16% default overclock and had very little headroom beyond that. The GTX 950 boards are also shipping with massive overclocks.

When I say I don't overclock I of course don't mean I'm turning down these boards. If the manufacturer has done their homework and tested the thing for me and guarantees it works I'm fine with that.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Twerk from Home posted:

I've had both a GTX 970 and an aftermarket cooled R9-290 and did some performance comparison of both of them on my own. The 970 is a better GPU, both stock and overclocked. Overclocked, it is a significantly better GPU than the 290 / 390.

Whoa, I don't think you can just equate the R9 290 and R9 390 like that. The most obvious difference is the latter having 20% higher memory bandwidth outright. There are process advances/better binning too. Because of the price hike the R9 290 is still in the picture for price/perf, but it's just an inferior card compared to the R9 390.

For thermals/power Nvidia always wins, yes; for pure perf I'm sure most published benchmarks just outright disagree with you.

Hiowf fucked around with this message at 19:46 on Aug 29, 2015

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

BurritoJustice posted:

Clock for clock the 290 and 390 are completely identical. There has been nothing to show that the "process/binning" has changed. A 390 is a 290 with higher stock clocks and 8GB of slightly faster VRAM (that you can run the 290's VRAM at), nothing more.

Silicon that has had process improvements (that allow for better binning) is going to be clock for clock identical. Not sure what else you expected?! The whole point is to up the clockspeed (or reduce the voltage).

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

BurritoJustice posted:

They are built on the exact same process. Anything else was just reddit speculation.

To be fair, Anandtech.com for example claimed the same: http://www.anandtech.com/show/9387/amd-radeon-300-series/3

"Last but certainly not least however, we want to talk a bit more about the performance optimizations AMD has been working on for the 390 series... "

Which at the end refers to voltage binning specifically.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Subjunctive posted:

PC ports of console games should tilt in AMD's favour if this is true, which would be quite the development. Console-first houses are likely to be less susceptible to NVIDIA's devrel charms.

There's also the issue that the consoles themselves have AMD GPUs.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

quote:

It will be a sad state of affairs if the phrase "buy a used 290" still exists next year

Used video cards always have great value for the money.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Combat Pretzel posted:

More Async Compute bullshit...

https://www.reddit.com/r/nvidia/comments/3j5e9b/analysis_async_compute_is_it_true_nvidia_cant_do/

tl;dr: NVidia does support it after all, but it has a shorter queue (i.e. fewer compute units), and if you submit more poo poo than can be queued, it trips up. Their CUs are however way faster than AMD's.

Reading more, it's still far from clear. What is clear is that you should care about actual benchmarks from stuff you play, not people trying to write micro-benchmarks that end up being optimized away or don't even properly hit the paths they pretend to test.

quote:

I think I saw it once in some obscure article, or maybe I dreamed it, but did anyone ever figure out how many ms of delay FreeSync and G-Sync introduce to a system versus just "vsync off"?

If you are hitting the max Hz, G-Sync adds delay of up to one refresh interval, because at the framerate cap it behaves like Vsync ON from the game's perspective. Some tests have shown extra lag (~20ms) if there are bad interactions with the game engine.

FreeSync is able to behave like Vsync OFF above the max Hz (AFAIK), but you have to take into account that you are dependent on the implementation in the monitor, and *that* may itself add delay (typically 1 frame) regardless of Hz. So you should check the reviews of the specific monitor.

If you really, really care about input latency then turn either off. If you don't specifically care then leave them on because the worst case delays aren't much anyway.
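
As a rough illustration of the worst cases (the refresh rates here are just example values, not measurements of any particular setup):

```python
# Behaving like Vsync ON at the cap can hold a frame back by up to one refresh interval.
for max_hz in (60, 90, 144):
    print(f"capped at {max_hz}Hz -> up to {1000 / max_hz:.1f}ms of extra delay")
# -> 16.7ms at 60Hz, 11.1ms at 90Hz, 6.9ms at 144Hz.
# A monitor that buffers one frame internally adds roughly one more interval on top.
```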

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

BurritoJustice posted:

With GSync you can toggle whether or not to enable VSync when you max the refresh. It is in the Nvidia control panel.

Cool. Most reviews must predate those driver enhancements.

I'm similarly reading some references that FreeSync now switches to the max Hz (and repeats frames - note that that's not the same as frame doubling) if you go below the range, rather than clamping to the lowest Hz. That seems like it'd make more sense, as the time difference between the ideal frame time and the one you can get would be lower (i.e. for a 35-90Hz range, you'd have about 2.6x less judder when vsyncing at 90Hz instead of 35Hz while rendering below 35fps).
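
The 2.6x is just the ratio of the two refresh intervals; a quick sanity check using the 35-90Hz range mentioned above:

```python
# Below the VRR range a new frame can only be shown on the next refresh tick,
# so the worst-case mistiming is one refresh interval of whatever rate you sync to.
low_hz, high_hz = 35, 90
print(1000 / low_hz)                       # ~28.6ms worst case when clamped to 35Hz
print(1000 / high_hz)                      # ~11.1ms worst case when repeating at 90Hz
print((1000 / low_hz) / (1000 / high_hz))  # ~2.57, i.e. the ~2.6x less judder
```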

Does anyone know a free demo or benchmark that would allow for testing this, i.e. that could bring a modern card to sub 35fps?

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

LiquidRain posted:

Super sample/DSR a demanding game.

4xSSAA on Crysis did the trick.

Anyway, at least as of 15.8 beta AMD's drivers + FreeSync still seem to do the most stupid possible thing: they lock to the lowest Hz when going outside the range.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

xthetenth posted:

Well it would potentially benefit nv to get VIA's chip designers.

(What I'm saying is that Denver was bad.)

Wut, where does this come from? Denver's performance is great unless you focus on a single micro-benchmark no one (except AnandTech) considers indicative of anything any more. It's one of the fastest non-Apple ARM cores, and remember it has a process node disadvantage compared to those.

Hiowf fucked around with this message at 09:05 on Oct 30, 2015

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

xthetenth posted:

It's got what, one design win including NV's products?

It's in the Nexus 9 (HTC).

AFAIK their future designs were announced to have Denver cores as well, so they still prefer it over licensing ARM's designs.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.
Intel and Nvidia have a patent cross-license, but it excludes x86. Long before the current one, they had a similar deal that Nvidia needed in order to make Intel chipsets.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

FaustianQ posted:

Even a 390X with an 1100MHz clock soundly beats the RX 480; drop the voltage and clocks so the 390X matches RX 480 performance, and the efficiency gain on the new node is laughable.

That would be a good strategy, if only wafers were free.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Xae posted:

The fact that AMD was able to drop more product to retailers in a week than Nvidia has in a couple of months (?) speaks to something being hosed in Nvidia land.

To be fair, the AMD dies are smaller, which gives an exponential payoff in usable chips per wafer.
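
A back-of-the-envelope sketch of why, using the simple Poisson yield model; the die areas and defect density below are made-up illustrative values, not actual figures for either chip:

```python
import math

def good_dies_per_wafer(wafer_mm2: float, die_mm2: float, defects_per_mm2: float) -> float:
    gross = wafer_mm2 / die_mm2                           # candidate dies, ignoring edge losses
    return gross * math.exp(-defects_per_mm2 * die_mm2)   # Poisson yield per die

wafer = math.pi * (300 / 2) ** 2   # 300mm wafer, ~70,700 mm^2
d0 = 0.002                         # assumed defect density (0.2 defects per cm^2)
print(good_dies_per_wafer(wafer, 230, d0))   # smaller die: ~194 good dies
print(good_dies_per_wafer(wafer, 315, d0))   # larger die:  ~120 good dies
```

The yield term is exponential in die area, so a smaller die wins twice: more candidates per wafer, and a larger fraction of them come out usable.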

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.
GTX 1080 vs GTX 980

1607 / 1126 = 43% clock speed gain from 28nm -> 16nm

RX 480 vs R9 390

1266 / 1000 = 26% clock speed gain from 28nm -> 14nm

The architectural improvements in the RX 480 are for the most part not in the shader core, so there's no shader IPC to win. FP16 support isn't used enough to lose clockspeed over, and I can't remember my shaders ever running out of instruction cache (but maybe that says more about my shaders). This is supported by the GFLOPS number not corresponding to any real-world perf increase.

We can't tell if it's AMD's design process or GloFo/Samsung's node, but this was never going to be a winner as soon as the relative clock speeds were announced.
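
For completeness, the same ratios computed explicitly (clock speeds as quoted above; treat this as a trivial check, nothing more):

```python
pairs = {
    "GTX 1080 vs GTX 980 (28nm -> 16nm)": (1607, 1126),
    "RX 480 vs R9 390 (28nm -> 14nm)": (1266, 1000),
}
for name, (new_clock, old_clock) in pairs.items():
    print(f"{name}: {(new_clock / old_clock - 1) * 100:.1f}% higher clocks")
# -> 42.7% and 26.6% respectively.
```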

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

xthetenth posted:

The 1070 being a big ol crop chip should help balance that out vs only whole die 480s, no?

Yes, which is why the GTX 1070 is easier to actually get despite being launched later. That doesn't change the GTX 1080 vs RX 480 situation, though.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

THE DOG HOUSE posted:

But during that launch event, the actual launch dates were released, which are the only dates that matter

"Launch dates" on which no one was actually able to obtain the card...are not launch dates.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

MrBond posted:

So wait, is the general consensus that the RX480 is a stinker? On the surface it seems reasonable for the price.

It would be fairly OK, if not for the power usage numbers. If those don't bother you, and you can get it for the announced price, it's a great card.

The problem with it being just "OK" is that NVIDIA can throw their excess 970 stock at it with discounts while the RX 480's price remains above MSRP.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.
Just put a retina MacBook next to a regular 1080p laptop and try to read the text on a webpage.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

xthetenth posted:

The reason everyone's super mad is because it's using a ton of watts to do it which means that all the people who put too much money into their computer (most of the thread) are really worried that they won't be able to fit much performance into 300W for a top end card.

It's bad because no room upwards for vendor A means no price pressure for vendor B, which leads to even more "too much money into their computer". Raaah.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Knifegrab posted:

So I really only follow Graphics card stuff casually, with everything dropping about the new AMD card can someone give me the low down? Is it better than Nvidia's offering?

It's a tiny bit better than a GTX 970, at similar prices.

That makes us sad.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

wicka posted:

buddy idk how your doom and gloom brand fits with agreeing that it's a really good card for the price

It's the only card on a new process at this price-point. The concern is that the competition can demolish it.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Barry posted:

Just curious, why are you so interested in system wattage? Do you have prohibitively high energy prices or is it just kind of a hobby type thing?

Watts cause heat, which requires cooling, which causes noise.

I have an MSI R9 390 and I can tell you that despite it being one of the quietest coolers, it gets pretty drat audible when running at full blast.

(In reality, I now have enough of these things in my office that the heat output itself is becoming a problem)

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

axeil posted:

So is the 480 an upgrade over a 380? I feel sort of like an idiot for getting a 380 last year since it's generally been a :geno: kind of card.

Getting Freesync would be nice, since it seems like the better implementation.

RX 480 performance is about that of an R9 390, so it's definitely an upgrade, and an improvement in power consumption as well.

In Germany the 4GB model is now in stock for 219 EUR, which sounds excellent.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

axeil posted:

Oh wow. I have the lovely 2 GB 380 so according to this it looks like I'll gain around 30% in terms of performance at 1080p.

100 / 63 = 1.59, so that's a 59% performance gain, not 30%.

quote:

Are the heat/wattage issues really a big deal

No, the card is fine. Just not where it was expected to be.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

axeil posted:

Although given my 380 experience I'm now super hesitant about pulling the trigger on any new card. Maybe I should wait another year? I have no idea any more.

We're not likely to see a bigger leap than the current one from the process shrink, though.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

TroubledWaters posted:

Is this one of the best 1070 choices out there? There are so many options and I feel really silly that I can't tell the difference between one and the other.

Clocks are good, but IIRC when they tested the 1080 version of that card the cooler was rather noisy.
