eames
May 9, 2009

Factory Factory posted:

Another tidbit:

Intel demo'd the Haswell GT3e IGP. It's roughly GeForce 650M-level. :circlefap:

And next up there's Broadwell, which will apparently bring another 40% IGP improvement on top of Haswell.

Suddenly articles like this don’t look so stupid anymore.

Why is Intel suddenly so successful in the IGP department after trying (and failing) so hard for so many years? :confused:


eames
May 9, 2009

I haven't followed the GPU scene for a while, but wow, the 290X seems like unbelievable value compared to the Titan. The gap almost seems too large to be true. :stare:

HardOCP review posted:

And consider this, for just $100 above the price of one GTX TITAN, you can own two Radeon R9 290X video cards and use those in a CrossFire configuration.

eames
May 9, 2009

Agreed posted:

Here's an article from TweakTown focusing on overclocking and it's about what I expected - low overclocking headroom on the R9 290X (we'll see if that holds or not once non-reference models start showing up and tools allow for more adjustment)

From what I read about the new PowerTune in the HardOCP review, overclocking this card with the regular reference cooler is completely pointless.
The card dissipates up to 300W (three hundred watts) of heat and dynamically adjusts its frequencies to stay within a preset power/temperature envelope, which means it actually clocks down to 800 MHz while running moderately intensive games to stay below 95°C. I wonder how far the GPU would go with water-cooling; probably very high.
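
For what it's worth, here's roughly how I picture that PowerTune limiter behaving, as a little Python toy. The step size, clock floor and control loop are my own invention rather than AMD's actual algorithm; only the 95°C / 300W / ~1 GHz figures come from the reviews.

code:

TEMP_LIMIT_C = 95          # throttle point mentioned in the reviews
POWER_LIMIT_W = 300        # rough board power ceiling
CLOCK_MAX_MHZ = 1000       # the advertised "up to 1 GHz"
CLOCK_MIN_MHZ = 300        # made-up floor, just for the sketch
STEP_MHZ = 13              # made-up step per control interval

def next_clock(clock_mhz, temp_c, power_w):
    """Pick the clock for the next interval from the current sensor readings."""
    if temp_c >= TEMP_LIMIT_C or power_w >= POWER_LIMIT_W:
        return max(CLOCK_MIN_MHZ, clock_mhz - STEP_MHZ)   # over budget: back off
    return min(CLOCK_MAX_MHZ, clock_mhz + STEP_MHZ)       # headroom: ramp back up

# A demanding game on the reference cooler pins the card at 95°C, so the clock
# settles wherever that cooler can hold the line (~800 MHz in the reviews);
# better cooling raises that equilibrium, which is why headroom depends on the
# cooler rather than on the silicon alone.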

On a different note, some German sites suggest that the GPU is already somewhat bandwidth-limited at the default 1 GHz. Or the architecture just scales well with memory bandwidth.

http://www.computerbase.de/artikel/grafikkarten/2013/amd-radeon-r9-290x-im-test/15/

https://www.youtube.com/watch?v=PFZ39nQ_k90

eames fucked around with this message at 19:45 on Oct 24, 2013

eames
May 9, 2009

The Lord Bude posted:

I haven't seen anyone mention this yet, and I'm not sure if I should stick it here or in the monitor thread, but it looks like G-sync is going to be exclusive to Asus monitors until the end of 2014:

http://www.pcauthority.com.au/News/362115,nvidias-g-sync-only-with-asus-for-now.aspx

Colour me entirely disinterested. Nvidia can prise my 1440p PLS panel out of my cold dead hands.

I was about to post that earlier, but then I realized that the source of this rumor is that super shady WCCFTech site. The author keeps writing sensationalist tweets and making up silly rumors like this to generate pageviews for his website.

https://mobile.twitter.com/usmanpirzada

So yeah, that rumor is BS.

eames fucked around with this message at 09:53 on Oct 29, 2013

eames
May 9, 2009


This site is literally the worst. Every time I see a ridiculous rumor pop up with only one shady source, it’s that site. See G-Sync being Asus-exclusive. Do not feed the trolls with advertising money.

eames
May 9, 2009

Josh Lyman posted:

LOL at people saying the 1070 is only good enough for 1080p

I received a laptop with a 1070 and a 75Hz 1080p screen yesterday. It's pretty pointless because all the games I play (mostly CPU-bound Blizzard games) now hover well over 130 FPS with all settings maxed, and even SSAA in some cases. I should have waited for better (1440p or 144Hz) screens or bought a model with the 1060 instead. :doh:

eames
May 9, 2009

Truga posted:

Can't you plug in a gsync screen and play on that? Or does that not work that way?

Also, any news on these new laptops supporting VR?

I could, but an external monitor would also require a keyboard and extra space on my desk, and I bought a laptop to get around all that.

VR should probably work fine; the SteamVR Performance Test gives me a score of 11.
I don't have anything to compare it to, but it maxed out the scale at "Very High" with 0 frames below 90 FPS.

Perhaps I'll buy a Vive in the future to justify the GPU. :vrfrog:

eames
May 9, 2009

The Lord Bude posted:

oh. Oh dear. So what do mac ports of games use? metal? My dream of owning a mac and continuing to play video games is in tatters, although I suppose I could boot into windows.

To be honest I can't think of many recent Mac ports. As mentioned above, Macs aren't great for gaming unless you build a Hackintosh yourself, at which point you may as well dual-boot into Windows for the newest games.

My 15" rMBP throttles its CPU from 2.3 to 1.3-1.4 GHz when the dGPU runs at full load for longer than 5 minutes. :eng99:

That's what happens when you put a 45W TDP CPU and a 50W TDP GPU into a relatively slim case and run the machine off an 85W power supply.

They're a bit like Tesla cars: super impressive for short bursts of speed, which is all the average user ever needs, but it all falls apart as soon as you demand maximum power for longer than a few seconds. :iiaca:

The next rMBP will probably be even thinner and thus even worse for sustained load.

eames
May 9, 2009

The Lord Bude posted:

To be fair I was talking about a 27" imac not a macbook, but I take your point. I was under the impression most AAA games get Mac ports these days. It would still be nice to use OSX most of the time and dual boot into windows for gaming. It's mostly just wishful thinking, given that Nvidia is now putting out slightly underclocked versions of desktop cards for laptops. Or I suppose an external GPU like that thing razer has happening - is that only being supported on that one laptop of theirs?

My current PC isn't going anywhere for the next couple of years in any case.

The iMacs are of course better for gaming, but their GPUs are still abysmal relative to their native screen resolutions and to what's available on the market. A new iMac with a mobile GTX 1080 is technically possible and would play most games at half resolution with decent frame rates. All we can do is wait and hope. :shobon:

External GPUs already work with Macbooks. Check the techinferno forums for more info.

Compared to an internal card you can expect ~25% lower frame rates using an external monitor and ~40% lower frame rates using the internal laptop screen because of Thunderbolt protocol overhead. I posted about this here.

The Razer Core enclosure costs $500 and a GTX 1070 costs $450, so you're probably better off building a separate gaming PC for the same price. You may even be able to buy stripped-down GTX 1060 laptops around that price in the future.

eames
May 9, 2009

FaustianQ posted:

I'm wondering now if it's not just a complete 28nm product, basically Polaris XT on 28nm? That's a ridiculous die size but it might explain the TDP?

If that is true (and I think it is) then Sony was probably caught with their pants down when Microsoft announced their next Xbox. The PS4 Pro looks rather lacklustre in my eyes. It seems like they primarily built it for PSVR and threw in a questionable non-native 4K solution as an afterthought to keep the marketing department happy. I suspect we'll see a PS4 Pro Slim shortly after the new Xbox is out.

310W... :crossarms:

eames
May 9, 2009

SwissArmyDruid posted:

....Wow, gently caress you, GloFo.

http://wccftech.com/amd-polaris-revisions-performance-per-watt/


I am becoming more and more irrationally mad at GloFo and Hector Ruiz the more news comes out. 4X5 cards on the horizon, I guess. =T

Also, not liking that Asus's 1050 Ti is using a full-length dual-fan cooler.

Oh boy, I wonder what this means for the new Macs, which are rumoured to come with Polaris/Vega GPUs and should be announced within the next two weeks.

Are they going to ship with the "old" GPUs or the newer revisions? Are we going to see a silent update if they launch with the older ones?
It's >30% more perf/watt after all.
Perhaps another piece in the puzzle of why the whole Apple Mac lineup is so hopelessly outdated.

eames
May 9, 2009

"Oh no guys the Nvidia driver crashed again." :downsrim:


Between the Tesla news, Nintendo rebranding the Shield 2 as their own console, Intel's iGPU development hitting a wall and AR/VR perhaps becoming a thing, now is probably a pretty good time to own some $NVDA.

eames
May 9, 2009

SwissArmyDruid posted:

Ho poo poo, I thought the 455 thing was scuttlebutt, and not that there was an actual product.


Well, here are The Tech Report's best guesses:



http://techreport.com/news/30881/radeon-pro-specs-hint-at-a-full-fat-polaris-11-gpu-in-macbook-pros

:ughh:

If that's true, the base 450 really won't be noticeably faster than the iGPU? I assume it was cheaper for Apple to drop that trash chip in than to design a separate iGPU-only 15" model with a custom PCB layout, heatsinks, etc., particularly since that would have been a dead end with GT4 Kaby Lake nowhere in sight.

The 460 is OK I guess; assuming 35W, it is at least in the same TFLOPS/watt ballpark as a GTX 1060.

eames
May 9, 2009

This new AMD site confirms the Polaris 11 specs.

1/1.3/1.86 TFLOPS.

http://creators.radeon.com/radeon-pro/

quote:

The Radeon Pro Graphics processor found on the MacBook Pro is thinner than a US penny with a Z-height of only 1.5 mm.

Aww yes, package height is the first thing I'm looking for when I go out to shop for a GPU. :downsbravo:

Twerk from Home posted:

The base 450 is going to be a lot faster than the iGPU, because the new 15" has a weaker iGPU than the new 13". They didn't spring for the big Iris GPU in the 15" guy this time around, way to go Apple.

The point still stands: a Skylake GT4 iGPU would have been on par with or faster than the 450.
Rolling out these laptops with Iris Pro would have forced them to either a) design a 15" iGPU-only SKU or b) sell a laptop with a dGPU that's slower than the iGPU, as Polaris yields are probably not good enough to put 455s/460s into every MBP.
What a dumpster fire.

eames fucked around with this message at 08:40 on Oct 28, 2016

eames
May 9, 2009

That power consumption decrease seemed impressive until I read the slides.

slides posted:

In World of Warcraft on an 8GB Radeon RX 480, Radeon Chill did the following:

Decreased average GPU power consumption by 31%, from 108W to 75W
Decreased average GPU temperature by 13° Celsius
Decreased 99th percentile display time by 51%
Audibly lowered fan noise and speeds

and then in a sentence two paragraphs down:

quote:

"Here’s one example taken from World of Warcraft. In this case, the FPS average is measurably higher with Radeon Chill turned off—125 FPS versus 62 FPS with Radeon Chill enabled."

:saddowns:
They're basically throttling framerates depending on how much movement there is on screen. I guess this will be useful for laptops, but I don't think it'll do much, if anything, with an RX 480 and a 100Hz 1440p screen.
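
From the description it sounds like a dynamic frame cap driven by input/on-screen activity. Here's a minimal sketch of how I imagine it works; the function shape and the 40/144 FPS limits are my guesses, not AMD's implementation (the real limits are apparently user-configurable).

code:

CHILL_MIN_FPS = 40     # assumed lower cap while nothing is happening
CHILL_MAX_FPS = 144    # assumed upper cap during fast movement

def frame_cap(activity):
    """activity in [0.0, 1.0]: 0 = camera and input idle, 1 = fast movement."""
    return CHILL_MIN_FPS + activity * (CHILL_MAX_FPS - CHILL_MIN_FPS)

def frame_time_ms(activity):
    """Target frame time for the limiter at the given activity level."""
    return 1000.0 / frame_cap(activity)

# Standing around in WoW: activity near 0, cap near 40 FPS, big power drop.
# Turning the camera or fighting: activity near 1, the cap lifts and power
# consumption comes right back, which matches the 62 vs. 125 FPS figures above.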

eames
May 9, 2009

SwissArmyDruid posted:

This feels like an architecture which is built for compute and machine learning first, which is then being used to render graphics.

A smart move, in my opinion.

That explains why Elon Musk stated that it was "a pretty tight call" between Nvidia and AMD GPUs for Tesla's AI applications. Interesting.

eames
May 9, 2009

repiv posted:

That's what I would have assumed, until I saw the official AMD video where they describe the card shown slightly outperforming the GTX1080 as "the flagship Vega" :(

e: vv If the ROCm slide leak is legit then 14nm Vega still has negligible (1/16th) double precision performance.



Interesting that this lists 16GB of HBM2, which is at odds with the YouTube armchair analysis linked above.

Those power targets, though; guess we won't see Vega APUs in notebooks anytime soon :stare:

eames
May 9, 2009

SwissArmyDruid posted:

ROCm cards are explicitly compute cards. They are AMD's equivalent to Nvidia Tesla cards. (Not to be confused with the products that Nvidia is making FOR Tesla. These are the cards that have no graphical outputs whatsoever)

Oh I see. I thought ROCm was a different codename for consumer Vega. :doh:
So those are pretty much what AMD presented to Tesla Motors as an alternative to the Drive PX2.

eames
May 9, 2009

I'd expect HBM to have far lower latency, though; surely that has to affect performance as well.

As for AMD stock, I think there is a good chance that Ryzen/Vega will be a disappointment compared to Kaby Lake/Pascal. The APUs should be very strong, though, and I'd say it is almost certain that AMD has the next console generation locked down. Musk said that the decision between NVIDIA and AMD for the new AP hardware was very close, so they must be doing something right there as well.
Add the fact that Intel and NVIDIA want AMD to stick around for antitrust reasons and I'd say things are looking pretty decent. I don't own AMD stock but I wish I did.

eames
May 9, 2009

I'm not sure how accurate the "Interface Load" hardware monitoring is, but I'm seeing 5-6% bus load while streaming games at 2880x1800 on a 1060 over PCIe 3.0 x16; that's about 40% more pixels than 1440p.
The load is even lower (2%) with software encoding because the CPU can't keep up.
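
For reference, the pixel-count arithmetic behind that "40% more" figure:

code:

streamed = 2880 * 1800   # 5,184,000 pixels per frame
qhd      = 2560 * 1440   # 3,686,400 pixels per frame
print(streamed / qhd)    # 1.40625, i.e. roughly 40% more pixels than 1440p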

eames
May 9, 2009

Intel CPUs à la Crystalwell but with 8GB of HBM2 as L3 cache? :stare:

eames
May 9, 2009

I am trying to play Rise of the Tomb Raider at 2880x1800 (roughly equivalent to UW 1440p) with the High preset, except for Textures set to Very High, on my 1060-6GB. It looks great.

Framerates (~40 FPS) are acceptable to me because I'm using it for Steam In-Home Streaming. The only problem is that the game keeps crashing with a message that it ran out of RAM. My OSD sometimes shows frame buffer usage of up to 5.8GB.
Is 6GB really not enough at this resolution?

eames
May 9, 2009

craig588 posted:

Are you using DX11 or DX12 mode? When I played it back on launch DX12 was notably less stable for me and didn't appear to have any dramatic image quality improvements.

DX11, for the same reasons you mentioned. DX12 also seems to double the Steam streaming latency.

EdEddnEddy posted:

RAM is your local memory, so either it has some sort of memory leak and it's eating up your full RAM amount + swap file, or your swap file is off and you still don't have enough RAM for everything it's trying to load and use at the time.
Might be worth googling, as threads like these talk about how using Very High textures most definitely goes over the 6GB VRAM limit and may be eating into your physical RAM as well, causing your issue.

What are you running for RAM again? And is your swap file on?

The only two games I have had issues like this with before were Sid Meier's Railroads and Crysis Warhead running the Living Legends MechWarrior mod. You could watch the memory usage increase nonstop until it crashed. :(

Swap file is on and automatic; the game is running in a VM and I had 8GB assigned to it. I bumped it up to 12GB of RAM and the crashes stopped. Looks like the "Very High" texture setting really does use more than the 6GB of VRAM on my card and starts swapping into RAM (and, failing that, perhaps even swapping textures onto the SSD). I've set textures to "High" now and things have settled at ~4GB of VRAM usage.

Note to self: buy an 8GB+ card next generation.

eames
May 9, 2009

Phosphine posted:

Is there some cool reason you're running games in a VM?

Not really, I'm just using my always-on headless Linux NAS to stream games to my rMBP at full resolution.
It only cost me the price of the GPU (~$300) instead of ~$1500 for a 1060-based notebook, plus the laptop stays cool while gaming and I get to stay in macOS without rebooting.

eames
May 9, 2009

I believe some cards have the zero-RPM fan mode disabled because parts of the board (VRMs/RAM/other components) would heat up too much even when the GPU is idle.

eames
May 9, 2009

VideoCardz has a bit more info on Vega and a rumored Polaris refresh/rebrand in May, but nothing surprising.

TLDR:

RX470 -> RX570, roughly between the GTX 1050 Ti and the 1060-3GB
RX480 -> RX580, ~GTX 1060
Vega11 -> RX590, ~GTX 1070
Vega10 -> Fury something, ~GTX 1080/Ti, $599-699, 8+6-pin power connectors (75+150+75 = 300W max board power; quick check below)
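
As a sanity check on that connector figure, assuming the usual PCIe limits (75W from the slot, 75W per 6-pin, 150W per 8-pin):

code:

SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150   # standard PCIe power limits

def max_board_power(six_pins=0, eight_pins=0):
    """Maximum in-spec power draw for a given connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(max_board_power(six_pins=1, eight_pins=1))   # 300W for an 8+6-pin card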

edit:

oh look a leaked benchmark :rolleyes:

eames fucked around with this message at 13:28 on Feb 22, 2017

eames
May 9, 2009

Surely a Pascal refresh would be a much more likely response than early Volta?
A few percent more cores here, a bit higher clockspeed there and perhaps even a bump in memory bus width... job done?

eames
May 9, 2009

titanium posted:

The self-boosting really confuses me, so what's the point of setting +200 if it's going to hit 2000MHz even if it's only set to +39?

Thanks for the replies btw sorry if this is redundant poo poo.

Pascal automatically adjusts the core frequency up to a certain maximum temperature, maximum voltage, or maximum power consumption cap.

Setting an offset of +200 means that it clocks higher relative to the voltage, i.e. if it would normally "boost" to 1800 MHz at 0.950V and 1836 MHz at 0.975V, it'll now boost to 2000 MHz at 0.950V and 2036 MHz at 0.975V, obviously at the potential cost of stability. (Those are random numbers I just made up.)

Since a lower voltage results in lower power consumption, the card will reach higher clock speeds until it hits either the maximum voltage or the total power cap of 180W.
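
In other words the offset just slides the whole voltage/frequency curve upward. A tiny sketch with the same invented numbers as above:

code:

stock_vf_curve = {0.950: 1800, 0.975: 1836}   # volts to MHz, made-up values

def offset_curve(curve, offset_mhz):
    """Apply a flat clock offset to every point of the V/F curve."""
    return {volts: mhz + offset_mhz for volts, mhz in curve.items()}

print(offset_curve(stock_vf_curve, 200))
# {0.95: 2000, 0.975: 2036}: the same voltages now map to higher clocks, so the
# card runs into its voltage/thermal/180W limits at a higher frequency, assuming
# the silicon stays stable there.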

eames
May 9, 2009

I use Precision X mainly because my card is from EVGA and I haven't tried anything else.
Things that continue working after the app is closed:
adjusted temp/power target
raised voltage limit
frequency offsets and custom frequency curves

Things that don't work after the app is closed:
custom fan speed curve (it seems to either get stuck at a fixed percentage or reset to the default BIOS curve; the former can be kind of hairy, although you will throttle before anything terrible happens)

eames
May 9, 2009

FWIW RX480s are on sale with significant price cuts, so perhaps we'll really see a Polaris refresh soon.

The "MSI RX480 8GB Armor OC" is on sale for $172 after promo & rebate, reference cards for $180.

If only AMD supported Steam streaming hardware acceleration; I paid 290€ for my 1060-6GB only two weeks ago. :argh:

eames
May 9, 2009

Heise.de claims the 570 and 580 are coming on April 4th, with the 550 and 560 a week later. All of them are Polaris refreshes/rebrands.
Vega is due around Computex in May/June.
Heise are usually very reliable and don't publish rumours unless they have an inside source.

https://www.heise.de/newsticker/mel...il-3637882.html

The leaked RX 580 benchmarks suggest 1070 performance; if they can deliver that at around $300 it'll be quite solid. We'll find out more in a few hours.

eames
May 9, 2009

There are some rumours about a 1060 Ti making the rounds now, but nothing substantial.
If true, that would make a Pascal refresh rather unlikely; Nvidia probably feels like it can deal with the Polaris refresh with one new card between the 1060 and the 1070, plus the odd price drop here and there.

I think it's going to be the 1080 Ti launch now, the RX 5x0 Polaris refresh in April, Vega with 1080 performance in June, and then nothing until Volta in Q1 2018.

Vega doesn't look like it will be competitive at the top end but the Zen + Vega APUs should be very, very interesting.

eames
May 9, 2009

stream link, should be embedded!

https://www.youtube.com/watch?v=ZVKDNeyfpAo

eames
May 9, 2009

Vega damage control mode engaged

eames
May 9, 2009

guy posted:

"by the time we are done developing this VR tech demo we'll need two vegas, not one vega to process it"


:ughh:

eames
May 9, 2009

I'm not sure if livestreaming this was a good idea, but it's quite entertaining to watch.

eames
May 9, 2009

TLDR: the top-end product will be a 350W dual-GPU card. :rip:

eames
May 9, 2009

Fauxtool posted:

So does that mean their top end product is still falling well short of the 980ti made almost 2 years ago?

My expectations at this point: Vega = GTX 1080; dual-GPU Vega with two downclocked GPUs = 160% of a GTX 1080 (in some titles).

eames
May 9, 2009

Fauxtool posted:

Is there any data to reinforce that expectation? I find it highly suspect they benchmark on ashes and doom so much instead of the games people are actually playing

no those are just numbers I pulled out of my rear. also history repeating itself :v:


eames
May 9, 2009

this VR title looks like it was developed for the iPhone 3GS
