Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



PerrineClostermann posted:

Vega is a concept, an ideal we strive towards. There is no Vega, because Vega represents our yearnings and struggles toward the future. Vega is pure, and will forever be beyond reach.

Then tell me why there's 18 hours of static?!?!?!


B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Wistful of Dollars posted:

Is AMD managing to go from CPU = BAD and GPU = GOOD to vice versa in one generation?

I was just thinking that myself.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

New Zealand can eat me posted:

Really though, is there any reason to actually be concerned about them using two Vega cards for the Prey demo? Everyone seems to have forgotten the "drivers still aren't ready and they're just emulating fiji" defense that was so popular when those janky numbers leaked

It reflects badly on Vega to even use two cards with no visible performance data. It's two months out; I highly doubt drivers aren't "ready" at this point. The only reasonable defense I've heard is that the demo wasn't about Vega at all, but rather the X399 platform. Constant pressure to show Vega is what resulted in them doing that at all.

I really don't care about big Vega anyway, still hoping for small Vega to replace the aging 290X I have for 1440p Freesync.

Delusibeta
Aug 7, 2013

Let's ride together.

New Zealand can eat me posted:

Really though, is there any reason to actually be concerned about them using two Vega cards for the Prey demo? Everyone seems to have forgotten the "drivers still aren't ready and they're just emulating fiji" defense that was so popular when those janky numbers leaked

The official argument is that they were demonstrating how Threadripper could handle two of the "latest and greatest graphics cards for content creation" (i.e. Vega). In the process, they made said graphics cards look like 1070s.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

SourKraut posted:

Then tell me why there's 18 hours of static?!?!?!

It represents the fog and confusion of life, the aimlessness of our meanderings as we search for the way to Vega

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Delusibeta posted:

The official argument is that they were demonstrating how Threadripper could handle two of the "latest and greatest graphics cards for content creation" (i.e. Vega). In the process, they made said graphics cards look like 1070s.

While that's obviously damage control, I think there may be a nugget of truth there.

Workstation processors are not good at gaming. Even thread-friendly games still have a single primary thread that bottlenecks first, and throwing more cores at it doesn't help much past a certain point. In fact, the extra threading overhead can actually hurt (especially if it puts more overhead onto the primary thread). Given how Ryzen is already noted for suffering huge issues while gaming with its funky interconnect, things may be even worse if threads are bouncing all over the place across four CCXs on two separate dies.

Threadripper is not a gaming chip and I think people who just want a high-end gaming chip are going to be disappointed. It's not really a competitor to the Intel HEDT lineup where you have 12 cores on a single die. It's a workstation chip and clocks are going to be lower (and they may have locked multipliers, who knows).

Maybe it's just super CPU bottlenecked on a single thread. Can using a lovely CPU make your 1080 Ti perform like a 1070 would? Yes.

But who knows given that they didn't show any Rivatuner data?

That's about the most positive interpretation I can give there. Maybe someone totally hosed up conceptually and used the wrong hardware to build their demo rig, like they thought they'd throw a bone to power-gamers or whatever. There is no possible interpretation where this wasn't an absolutely insanely idiotic demo to give.

Also, the idea that you somehow need to prove that your CPU can do CrossFire is absolutely ludicrous. You've been able to do CrossFire on consumer CPUs for loving ages now, and running at PCIe 3.0 x8 speed has almost no impact on anything except really intensive compute.
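For scale, here is the back-of-the-envelope arithmetic behind that x8 claim: a minimal sketch using the published PCIe 3.0 line rate and encoding, not anything measured from the demo.

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
GT_PER_S = 8e9
ENCODING = 128 / 130   # usable fraction after line encoding
BITS_PER_BYTE = 8

per_lane = GT_PER_S * ENCODING / BITS_PER_BYTE   # ~0.985 GB/s per lane
print(f"x8:  {8 * per_lane / 1e9:.2f} GB/s")     # ~7.88 GB/s
print(f"x16: {16 * per_lane / 1e9:.2f} GB/s")    # ~15.75 GB/s
```

Even at x8 a card has roughly 7.9 GB/s of link bandwidth, which games rarely come close to saturating.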

Paul MaudDib fucked around with this message at 20:39 on May 31, 2017

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

FaustianQ posted:

It's two months out; I highly doubt drivers aren't "ready" at this point.

I was getting drops of drivers with "critical fixes" (including follow-up email to make sure we'd deployed it to all the test/demo machines) 3 weeks before Fiji street date.

They also likely locked the demo config a fair while before the demo date. Nobody gets to check in anything meaningful to the demo branch the week before Lisa goes on stage.

Subjunctive fucked around with this message at 20:51 on May 31, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Subjunctive posted:

I was getting drops of drivers with "critical fixes" (including follow-up email to make sure we'd deployed it to all the test/demo machines) 3 weeks before Fiji street date.

They also likely locked the demo config a fair while before the demo date. Nobody gets to check in anything meaningful to the demo branch the week before Lisa goes on stage.

So the drivers used in a demo like this aren't even likely to be good, just "stable".

Paul MaudDib posted:

While that's obviously damage control, I think there may be a nugget of truth there.

Workstation processors are not good at gaming. Even thread-friendly games still have a single primary thread that bottlenecks first, and throwing more cores at it doesn't help much past a certain point. In fact, the extra threading overhead can actually hurt (especially if it puts more overhead onto the primary thread). Given how Ryzen is already noted for suffering huge issues while gaming with its funky interconnect, things may be even worse if threads are bouncing all over the place across four CCXs on two separate dies.

Threadripper is not a gaming chip and I think people who just want a high-end gaming chip are going to be disappointed. It's not really a competitor to the Intel HEDT lineup where you have 12 cores on a single die. It's a workstation chip and clocks are going to be lower (and they may have locked multipliers, who knows).

Maybe it's just super CPU bottlenecked on a single thread. Can using a lovely CPU make your 1080 Ti perform like a 1070 would? Yes.

But who knows given that they didn't show any Rivatuner data?

That's about the most positive interpretation I can give there. Maybe someone totally hosed up conceptually and used the wrong hardware to build their demo rig, like they thought they'd throw a bone to power-gamers or whatever. There is no possible interpretation where this wasn't an absolutely insanely idiotic demo to give.

Also, the idea that you somehow need to prove that your CPU can do CrossFire is absolutely ludicrous. You've been able to do CrossFire on consumer CPUs for loving ages now, and running at PCIe 3.0 x8 speed has almost no impact on anything except really intensive compute.

It's two months until we really know, but early August is gonna be one hell of a time. I'm divesting myself as much as possible to enjoy whatever kind of fireworks happen.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

FaustianQ posted:

So the drivers used in a demo like this aren't even likely to be good, just "stable".

Well, they believe the drivers show off the product well enough to put the CEO in front of it, so that tells you something.

They're very likely not the drivers that will be available on day 1 of the glorious Vega future. Whether they are now ready, or will be then, or will be ever, is more philosophical than technical.

eames
May 9, 2009

AMD GrapeJuice™ technology

Delusibeta
Aug 7, 2013

Let's ride together.

Paul MaudDib posted:

Maybe it's just super CPU bottlenecked on a single thread. Can using a lovely CPU make your 1080 Ti perform like a 1070 would? Yes.

The problem with this argument is that the demo was in 4K, so either Threadripper has Bulldozer-grade single-threaded performance, or the graphics card was the bottleneck. Frankly, I'm putting all my chips on the latter; I don't think AMD will downgrade the single-thread performance of the HEDT processors compared to their mainstream chips to such an extent that there was a CPU bottleneck in that demo.
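The pixel arithmetic behind that intuition, as a minimal sketch: at 4K the GPU pushes four times the pixels of 1080p, while per-frame CPU work (draw calls, game logic) stays roughly constant.

```python
# 4K quadruples pixel work versus 1080p; CPU-side work per frame is
# roughly resolution-independent, so high resolutions shift the
# bottleneck toward the GPU.
px_1080p = 1920 * 1080    # ~2.07 million pixels
px_4k = 3840 * 2160       # ~8.29 million pixels
print(px_4k / px_1080p)   # 4.0
```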

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Maxwell Adams
Oct 21, 2000

T E E F S
I want to nickname it Zendripper.

SwissArmyDruid
Feb 14, 2014

by sebmojo
http://techreport.com/news/32010/qnap-ts-x77-nas-units-virtualize-with-ryzen

Well, this is interesting. QNAP NASes with Ryzen parts in them.

----------------------------------------------------------------------------------------------

Can I just say how happy I am that DVI connectors on motherboards seem to finally be dead? That's the X370 Gaming ITX/ac that ASRock is showing off at Computex, btw.



----------------------------------------------------------------------------------------------

Pricing out the two small boxes instead of the double-headed monster; the old Phenom II that the two munchkins were using croaked somewhat inconveniently the other night, and I figured it's best to just move on wholesale.

People who have jumped on already: What's best practice for Ryzen, one stick of X GB RAM, or two sticks of (1/2)X GB?

SwissArmyDruid fucked around with this message at 12:03 on Jun 1, 2017

Wistful of Dollars
Aug 25, 2009

SwissArmyDruid posted:

http://techreport.com/news/32010/qnap-ts-x77-nas-units-virtualize-with-ryzen


People who have jumped on already: What's best practice for Ryzen, one stick of X GB RAM, or two sticks of (1/2)X GB?

You know, that question never crossed my mind. I don't own one yet, but I'm so used to just slapping two sticks in and calling it a day.

Very curious to see what the answer is.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
two sticks means double the bandwidth
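A quick sketch of the arithmetic behind that, assuming DDR4-2400 purely as an example speed; any DDR4 bin scales the same way.

```python
# Peak theoretical bandwidth per 64-bit channel = transfers/s * 8 bytes.
transfers_per_s = 2400e6   # DDR4-2400, an illustrative speed
channel_bytes = 8          # 64-bit channel width

one_stick = transfers_per_s * channel_bytes   # ~19.2 GB/s, single channel
two_sticks = 2 * one_stick                    # ~38.4 GB/s, dual channel
print(one_stick / 1e9, two_sticks / 1e9)
```

Note the assumption: one stick per channel. Two sticks sharing the same channel don't double anything.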

redeyes
Sep 14, 2002

by Fluffdaddy

quote:

Threadripper is not a gaming chip and I think people who just want a high-end gaming chip are going to be disappointed. It's not really a competitor to the Intel HEDT lineup where you have 12 cores on a single die. It's a workstation chip and clocks are going to be lower (and they may have locked multipliers, who knows).

Are you slightly crazy? Any of these high-end chips are great for gaming. Just because you get 3 or 3.5 GHz per core does not mean games are going to suddenly suck.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Suddenly, many-core chips are uncool if they're from AMD but an object of lust if they're from Intel, and the inter-core latency of AMD's solution is given an enormous amount of weight even though it's been found to be a minor issue at worst in a handful of situations, and in others simply not a problem.

These new AMD chips are offering ECC RAM support, a ton of cores, and tons of PCIe lanes, no doubt at a price Intel won't want to compete with. What the hell is there to really complain about?

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

redeyes posted:

Are you slightly crazy? Any of these high-end chips are great for gaming. Just because you get 3 or 3.5 GHz per core does not mean games are going to suddenly suck.

I think he means from a value prop. If your primary use is gaming (non-streaming) then you will likely get a lot more value for your money elsewhere; right now the vast majority of games do not use 4 cores (and even the ones that do are usually still single-core dominant). That will eventually change, though dunno if that is 2-4 years from now or 5-8.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
Paul simply has a hateboner for AMD

redeyes
Sep 14, 2002

by Fluffdaddy

Lockback posted:

I think he means from a value prop. If your primary use is gaming (non-streaming) then you will likely get a lot more value for your money elsewhere; right now the vast majority of games do not use 4 cores (and even the ones that do are usually still single-core dominant). That will eventually change, though dunno if that is 2-4 years from now or 5-8.

I guess... seems like AMD is a decent value, what with 4 cores / 8 threads for less than a 7700K. I get it isn't quite max FPS in all scenarios, but isn't it plenty good enough for most gamers?

quote:

These new AMD chips are offering ECC RAM, a ton of cores and tons of PCIe lanes, no doubt at a price Intel won't want to compete with. What the hell is there to really complain about?

My workstation is a 7700K and I would like more cores for doing workstation things. The only thing holding me back is the immaturity of the platform.

Truga
May 4, 2014
Lipstick Apathy
Min FPS is better on ryzen for whatever reason, and min FPS dips are much more noticeable than average fps, so I'd take a 4 core ryzen over a 4 core intel any time of day. The price is just bonus on top of that.

repiv
Aug 13, 2009

Truga posted:

Min FPS is better on ryzen for whatever reason, and min FPS dips are much more noticeable than average fps, so I'd take a 4 core ryzen over a 4 core intel any time of day. The price is just bonus on top of that.

Wait, 4-core Ryzen has better min-fps than 4-core Intel? I thought that was just a perk of the 6/8-core Ryzen models.

redeyes
Sep 14, 2002

by Fluffdaddy

repiv posted:

Wait, 4-core Ryzen has better min-fps than 4-core Intel? I thought that was just a perk of the 6/8-core Ryzen models.

I think it was 4-core Ryzen vs. 2-core Intel... could be wrong.

Truga
May 4, 2014
Lipstick Apathy
I thought it was due to architectural differences, as I don't really see how extra cores would magically fix random stutters in the main game render thread, but I could be wrong.


e: Yeah sorry, googled it and it's actually not higher min fps than Intel's. I couldn't find any frametime graphs quickly though; I'll try to find some when I have more time.

Truga fucked around with this message at 16:08 on Jun 1, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

redeyes posted:

Are you slightly crazy? Any of these high-end chips are great for gaming. Just because you get 3 or 3.5 GHz per core does not mean games are going to suddenly suck.

3 GHz clock rates will actually have a very detrimental impact on game performance. Take a Ryzen 7 and clock it down by 33%; what do you think happens?
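To put rough numbers on that, a sketch under idealized assumptions: the 10 ms of CPU work per frame is hypothetical, and frame time is assumed to scale linearly with clock, which ignores memory speed, so treat it as an upper bound on the hit.

```python
cpu_ms_at_4ghz = 10.0                            # hypothetical CPU-bound frame
cpu_ms_at_3ghz = cpu_ms_at_4ghz * (4.0 / 3.0)    # same work at 3 GHz: ~13.3 ms

print(f"{1000 / cpu_ms_at_4ghz:.0f} fps")        # 100 fps
print(f"{1000 / cpu_ms_at_3ghz:.0f} fps")        # 75 fps
```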

Lockback posted:

I think he means from a value prop. If your primary use is gaming (non streaming) then you will likely get a lot more value for your money elsewhere, right now the vast majority of games do not use 4 cores (and even the ones that do are usually still single-core dominant). That will eventually change, though dunno if that is 2-4 years from now or 5-8.

I mean in an absolute sense too. Gaming is typically bottlenecked on a single thread - even in the world of DX12 there is still a single primary thread that takes a disproportionate chunk of the load. You need to clock high to get that thread running fast enough. Workstation chips are going to have lower turbo clocks, and if you attempt to spread out over all your cores you're going to push the turbo clocks even lower.

Every thread you add increases the synchronization overhead. At first, the gains from moving work off the primary/bottlenecked thread are going to be worth it. At some point the overhead adds up and you're not really getting anything. That point is certainly at less than 32 threads. Like, right now it appears to be 6-8 threads for most games based on experimentation someone did on Ryzen 7 a few months ago.
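The shape of that argument is Amdahl's law. A minimal sketch, with the serial fraction as a purely hypothetical figure (real games vary per title):

```python
def speedup(threads, serial_fraction=0.4):
    """Amdahl's law: the serial fraction caps the achievable speedup."""
    return 1 / (serial_fraction + (1 - serial_fraction) / threads)

for n in (1, 2, 4, 8, 16, 32):
    print(f"{n:>2} threads: {speedup(n):.2f}x")
# 1.00x, 1.43x, 1.82x, 2.11x, 2.29x, 2.39x: the curve is nearly flat well
# before 32 threads, and that's before counting synchronization overhead.
```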

HalloKitty posted:

Suddenly, many-core chips are uncool if they're from AMD but an object of lust if they're from Intel, and the inter-core latency of AMD's solution is given an enormous amount of weight even though it's been found to be a minor issue at worst in a handful of situations, and in others simply not a problem.

These new AMD chips are offering ECC RAM support, a ton of cores, and tons of PCIe lanes, no doubt at a price Intel won't want to compete with. What the hell is there to really complain about?

:lol: gaming doesn't need ECC, why would you even suggest that?

Intel's chips clock a lot higher, and Skylake-X is probably going to have some decent IPC gains. Those are tangible advantages for gaming.

We don't really know how latency is going to work, but we can certainly look at how it already works for existing multi-socket systems. Hint: badly. You see no gain from the second socket, and in many cases performance gains are negative due to threads bouncing around between sockets. Threadripper is basically multi-socket-on-a-chip and it'll probably behave much the same way.

Again: Threadripper is going to be nice as a workstation chip, but it's not a gaming chip. Ryzen 7 is already HEDT class, Threadripper is basically a Xeon. You don't buy Xeons for gaming. You certainly don't build a multi-socket Xeon system for gaming. Somehow it's a hateboner to point that out?

Paul MaudDib fucked around with this message at 16:20 on Jun 1, 2017

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Paul MaudDib posted:

:lol: gaming doesn't need ECC, why would you even suggest that?

I'm not talking in relation to gaming at all, I'm just saying it's a nice bonus for other uses

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

HalloKitty posted:

I'm not talking in relation to gaming at all, I'm just saying it's a nice bonus for other uses

Then I think you didn't read the quote you were responding to. Here's the excerpt that redeyes found fault with:

quote:

Threadripper is not a gaming chip and I think people who just want a high-end gaming chip are going to be disappointed. It's not really a competitor to the Intel HEDT lineup where you have 12 cores on a single die. It's a workstation chip and clocks are going to be lower (and they may have locked multipliers, who knows).

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



HalloKitty posted:

I'm not talking in relation to gaming at all, I'm just saying it's a nice bonus for other uses

People like to move the goalposts, 'tis all.

Drakhoran
Oct 21, 2012

SwissArmyDruid posted:


Can I just say how happy I am that DVI connectors on motherboards seem to finally be dead?

Only on high-end motherboards. Here is the Asus Prime B350M-A:




Notice that the video out options are:

1x HDMI
1x DVI-D
1x D-Sub.


That is not dead which can eternal lie,
And with strange aeons even death may die.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Paul MaudDib posted:

Then I think you didn't read the quote you were responding to. Here's the excerpt that redeyes found fault with:

OK, I guess I was just making a slightly disjointed comment then, never mind

repiv
Aug 13, 2009

Truga posted:

I thought it was due to architectural differences, as I don't really see how extra cores would magically fix random stutters in the main game render thread, but I could be wrong.

If those stutters are caused by background processes being scheduled onto the same core as the main game thread, simply throwing more cores at the problem will help by giving the scheduler somewhere else to put those background tasks.
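For the curious, that scheduling effect can be forced by hand. A sketch using the third-party psutil library; the PID and core numbers are purely illustrative, and changing affinity on other processes may need admin rights.

```python
import psutil

game = psutil.Process(1234)   # hypothetical game PID
game.cpu_affinity([0])        # pin the game to core 0

for p in psutil.process_iter():
    if p.pid == game.pid:
        continue
    try:
        p.cpu_affinity([1, 2, 3])   # background tasks on the other cores
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        pass                        # system processes we can't touch
```

In practice the OS scheduler already does this on its own whenever it has idle cores, which is the point above.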

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

If those stutters are caused by background processes being scheduled onto the same core as the main game thread, simply throwing more cores at the problem will help by giving the scheduler somewhere else to put those background tasks.

Take an 8-core Ryzen 7, put Windows into high-performance mode to force it to clock up, then look at your CPU utilization percentage while you're at your desktop. That's how much background load you'll be carrying during a game.

Like, how high do you think it is? 5% constant load? Probably less, I'd think. The reality is that Chrome now throttles background tasks, etc. Unless you've got antivirus running or something, the rest of the system just doesn't eat that many cycles.
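If you'd rather measure that than eyeball Task Manager, a tiny sketch with the third-party psutil library (the numbers depend entirely on what your machine is running):

```python
import psutil

# Average utilization over a 5-second window, then per-logical-CPU.
print(psutil.cpu_percent(interval=5))
print(psutil.cpu_percent(interval=5, percpu=True))
```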

Having 16 threads available is already plenty to schedule minor background tasks on. The impact of going from 16 to 32 threads available for that is going to be nil, and the drop in clock rates (again, comparing a 4 GHz Ryzen 7 vs a 3 GHz Threadripper) is going to negatively affect game performance.

Ryzen 7 is already leaning way far to the "lots of cores" side of things. Going further is not going to help game performance.

Paul MaudDib fucked around with this message at 16:28 on Jun 1, 2017

repiv
Aug 13, 2009

Paul MaudDib posted:

Having 16 threads available is already plenty to schedule minor background tasks on.

That was literally my point. We were talking about mainstream Ryzen, not Threadripper.

repiv fucked around with this message at 17:18 on Jun 1, 2017

redeyes
Sep 14, 2002

by Fluffdaddy

Paul MaudDib posted:

Then I think you didn't read the quote you were responding to. Here's the excerpt that redeyes found fault with:

It's not that I find fault with the statement, more that it's hyperbolic. I just rarely see the mythical dedicated single-use gaming PC. I'm sure it exists, but in my world PCs are used for many other things. Nerds like cores/threads and gamers like MHz?

Are there even any benchmarks of Threadripper to talk about right now?

redeyes fucked around with this message at 16:40 on Jun 1, 2017

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Paul MaudDib posted:

Take an 8-core Ryzen 7, put Windows into high-performance mode to force it to clock up, then look at your CPU utilization percentage while you're at your desktop. That's how much background load you'll be carrying during a game.

Like, how high do you think it is? 5% constant load? Probably less, I'd think. The reality is that Chrome now throttles background tasks, etc. Unless you've got antivirus running or something, the rest of the system just doesn't eat that many cycles.

Having 16 threads available is already plenty to schedule minor background tasks on. The impact of going from 16 to 32 threads available for that is going to be nil, and the drop in clock rates (again, comparing a 4 GHz Ryzen 7 vs a 3 GHz Threadripper) is going to negatively affect game performance.

Ryzen 7 is already leaning way far to the "lots of cores" side of things. Going further is not going to help game performance.

My background usage with Chrome, Discord, Skype, etc., etc. on a 2600K is usually 20-30%.

I have too many Chrome tabs...

On a fresh boot, it's usually 5-10%

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

PerrineClostermann posted:


On a fresh boot, it's usually 5-10%

That is ridiculously high for no applications running; you might be a botnet. Right now, with a 2500K running Chrome, a Webex meeting I am not paying attention to, Skype, and Outlook, I am at 6%. Most of that is Webex.

Sininu
Jan 8, 2014

PerrineClostermann posted:

My background usage with Chrome, Discord, Skype, etc., etc. on a 2600K is usually 20-30%.

I have too many Chrome tabs...

On a fresh boot, it's usually 5-10%

I feel something is very wrong with your computer. On my computer, with Discord, Steam, Spotify, and Chrome with lots of tabs running and nothing new loading, I have 5-10% CPU usage. And while idling it has just 0-2% usage.
It's a mobile i7 though, so more cores.

Sininu fucked around with this message at 17:09 on Jun 1, 2017

redeyes
Sep 14, 2002

by Fluffdaddy

SinineSiil posted:

I feel something is very wrong with your computer. On my computer, with Discord, Steam, Spotify, and Chrome with lots of tabs running and nothing new loading, I have 5-10% CPU usage. And while idling it has just 0-2% usage.
It's a mobile i7 though, so more cores.

i7 hyperthreaded cores are hard to gauge on the CPU usage meters. You can hit near 100% and Task Manager will still be showing 50%.
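A small sketch of why the meter misleads, again with psutil; the 4C/8T figures are just an example.

```python
import psutil

logical = psutil.cpu_count()                 # e.g. 8 on a 4C/8T mobile i7
physical = psutil.cpu_count(logical=False)   # e.g. 4

# One busy thread per physical core occupies every core, yet the usage
# meter averages over logical CPUs and reads roughly physical/logical.
print(f"fully loaded can read as ~{100 * physical // logical}%")
```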


PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Lockback posted:

That is ridiculously high for no applications running; you might be a botnet. Right now, with a 2500K running Chrome, a Webex meeting I am not paying attention to, Skype, and Outlook, I am at 6%. Most of that is Webex.

By "fresh boot" I mean "I just rebooted and started all the programs I use."

SinineSiil posted:

I feel something is very wrong with your computer. On my computer with Discord, Steam, Spotify and Chrome with lots of tabs running and nothing new loading I have 5-10% CPU usage. And while idling it has just 0-2% usage.
It's mobile i7 though so more cores.


I have an inordinate number of Chrome tabs, along with just a bunch of stuff running in the background, so I'm not particularly bothered. I have been meaning to do a fresh install though, just because this Windows install has been upgraded all the way from when W7 was the new hotness, and has been across several drives...
