SwissArmyDruid
Feb 14, 2014

by sebmojo

Potato Salad posted:

Motion to just use Zen, Zen Refresh, and Zen2?

No. Zen, Zen.5, and ZenTwo. :colbert:

edit: Zentoo. Just like my favorite distro of Linux. :v:

VVVVVV I mean, if I were in charge of naming, I would entertain the idea, but it's hard to strike the balance of "taking the piss out of Chipzilla" and "just copying them relentlessly".

SwissArmyDruid fucked around with this message at 19:54 on Sep 5, 2017


Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Zen, Zen Lake, then Ryzen Lake

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Ryzen XP

Stanley Pain
Jun 16, 2001

by Fluffdaddy
Ryzen All In Wonder.

Sormus
Jul 24, 2007

PREVENT SPACE-AIDS
sanitize your lovebot
between users :roboluv:
Wait, instead of tick-tock AMD has an up-rebrand cycle? :V

nerox
May 20, 2001

Stanley Pain posted:

Ryzen All In Wonder.

An onboard tv tuner? :v:

An on board HDMI input would be neat/useful for a few things

Pablo Bluth
Sep 7, 2007

I've made a huge mistake.

Paul MaudDib posted:

Zen2 has always had the potential for greatness. Zen1 is already not half bad; if they can bump the IPC and clocks up a bit, fix the memory controller's touchiness, and get some of the errata fixed, then Zen2 could easily be going toe-to-toe with Coffee Lake.

They should probably introduce AVX512 too. Even if it's not that relevant to most people, it gives a poor impression to lose badly in a subset of benchmarks.

Sormus
Jul 24, 2007

PREVENT SPACE-AIDS
sanitize your lovebot
between users :roboluv:
Onboard analog tv-tuner

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



repiv posted:

new leak from wccftech



The compression blur just makes it :discourse:

Arzachel
May 12, 2012

Sormus posted:

Wait, instead of tick-tock AMD has an up-rebrand cycle? :V

Ryzen refresh is supposed to be a straight respin with no architectural changes. Think Devil's Canyon, except with actually significant clockspeed increases, since we're already seeing 200-300MHz from minor steppings between Ryzen and Threadripper/Epyc.

Arzachel fucked around with this message at 20:43 on Sep 5, 2017

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

Pablo Bluth posted:

They should probably introduce AVX512 too. Even if it's not that relevant to most people, it gives a poor impression to lose badly in a subset of benchmarks.

Well, if Threadripper and Epyc are basically just lots of Ryzens glued together, not having 512-bit SIMD instructions takes AMD's server processors off the table for a lot of applications that could otherwise benefit from the memory address space/bandwidth and PCIe lanes.

Munkeymon posted:

The compression blur just makes it :discourse:

Yeah, it's pretty much spot on.

Cygni
Nov 12, 2005

raring to post

repiv posted:

new leak from wccftech



yesssss

btw in case anyone was wondering where I was pulling that from, it was these articles on AT, mixed with some of the slides from AMD's Ryzen presentations.

GlobalFoundries Details 7 nm Plans: Three Generations, 700 mm², HVM in 2018

Samsung and TSMC Roadmaps: 8 and 6 nm Added, Looking at 22ULP and 12FFC

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Pablo Bluth posted:

They should probably introduce AVX512 too. Even if it's not that relevant to most people, it gives a poor impression to lose badly in a subset of benchmarks.

They're not going to do that. It would take too much die space, and it would ruin them in power-consumption charts. Hell, they don't even do native AVX2 instructions (it's executed as a pair of AVX1 ops, effectively halving the throughput).

This has actually been a boon for them, because those charts measure power consumption, not power efficiency. Yeah, when running Prime95 a Skylake-X pulls a lot more power, but it's also churning through a far greater number of primes, so efficiency is actually much higher. But that's not what the charts are measuring. This also happens to be the only way you get those hilarious 300W figures that melt VRMs and poo poo; Prime95 is a ridiculously unrealistic load, and virtually any other task drops power consumption to a fraction of that. AMD would be exposing themselves to similar problems/discourse about their processors.

So yeah, slow AVX performance that reduces die space and leaves you looking good in power-consumption charts is probably an overall boon for AMD, vs. a Pyrrhic victory chasing Intel's performance.

The place where the rubber meets the road is media creation, though (encoding and rendering). x265 is very good at using AVX512, and this is one of the few benchmarks where the 7900X blows out the 1950X, despite the latter's 60% advantage in core count. It also substantially closes the gap in x264 and Blender rendering, since these are AVX512-aware as well. The 1950X still wins in these, but the 7900X closes the gap down to 5-20% despite its much lower core count, thanks to AVX512.



AMD's marketing has revolved around the idea that everyone is suddenly streaming or doing 3D rendering, so it's pretty funny to see them underperforming so drastically in those tasks. AVX512 is actually pretty good for workstation-type users, and is important in various kinds of HPC workloads as well. Not everyone is a workstation user, of course, but AMD's marketing has really been pushing the idea that you need a 16-core NUMA setup to leave a couple tabs open in Chrome while you game.

(it's even more hilarious that when GN did testing they determined that, while they could not reject the null hypothesis, Intel's G4560 actually outperformed the 4-core Ryzen parts when they added background load)

Paul MaudDib fucked around with this message at 21:28 on Sep 5, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
They will be moving to native AVX2 though, and just pairing AVX2 ops for AVX-512; they've explicitly stated they want to do this for Zen2. I don't think AMD will ever go native AVX-512 until they're confident in their market position.

EDIT: To be clear, I think this is where the APUs and CPUs will diverge, so 7nm Zen2 APUs will not have AVX-512 capability, and will probably have fewer PCIe lanes and less I/O as well.

EmpyreanFlux fucked around with this message at 21:39 on Sep 5, 2017

repiv
Aug 13, 2009

Paul MaudDib posted:

Hell, they don't even do native AVX2 instructions (it's executed as a pair of AVX1 ops, effectively halving the throughput).

Zen doesn't do full-rate AVX1 either, it breaks all 256-bit SIMD ops into a pair of 128-bit ops (effectively decomposing AVX into SSE).

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

If you're doing that kind of high-throughput encoding, wouldn't it be insanely faster and more efficient to offload that work to whatever FirePro card, assuming you're inside the AMD ecosystem anyhow?

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Is AVX-512 even going to become a big deal at some point? I thought it was generally viewed as going the way of AltiVec, and currently it's just a way for Intel to put big numbers in certain benchmarks for the "MAH FRAMEZ" crowd.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BangersInMyKnickers posted:

If you're doing that kind of high throughput encoding, wouldn't it be insanely faster and more efficient to offload that work to whatever Firepro card assuming you're inside the AMD ecosystem anyhow?

Yes, streamers are typically better off using GPU encoding. CPU encoding has better quality but it's difficult to achieve this in real-time, and if you are going to attempt to do so then the gold standard is to do the encoding on a dedicated machine since anything that exceeds the quality of GPU encoding is also going to poo poo up your framerate something fierce, even on an 8-core processor.

Which, again, is why AMD's marketing is deceptive here. Regular users are better off using GPU encoding due to substantially lower impacts on framerate, ~*professional streamers*~ are better off getting a dedicated machine to handle encoding.

Yaoi Gagarin
Feb 20, 2014

Paul MaudDib posted:

Yes, streamers are typically better off using GPU encoding. CPU encoding has better quality but it's difficult to achieve this in real-time, and if you are going to attempt to do so then the gold standard is to do the encoding on a dedicated machine since anything that exceeds the quality of GPU encoding is also going to poo poo up your framerate something fierce, even on an 8-core processor.

Which, again, is why AMD's marketing is deceptive here. Regular users are better off using GPU encoding due to substantially lower impacts on framerate, ~*professional streamers*~ are better off getting a dedicated machine to handle encoding.

Seems hard, you'd need a dedicated machine that can really rip through several threads at once

repiv
Aug 13, 2009

Paul MaudDib posted:

Yes, streamers are typically better off using GPU encoding. CPU encoding has better quality but it's difficult to achieve this in real-time, and if you are going to attempt to do so then the gold standard is to do the encoding on a dedicated machine since anything that exceeds the quality of GPU encoding is also going to poo poo up your framerate something fierce, even on an 8-core processor.

Which, again, is why AMD's marketing is deceptive here. Regular users are better off using GPU encoding due to substantially lower impacts on framerate, ~*professional streamers*~ are better off getting a dedicated machine to handle encoding.

To be fair to AMD, GPU-encoded Twitch streams looked pretty bad around the time of the Ryzen launch. Unfortunately for them Twitch almost doubled the bitrate limit a month later so it's far more forgiving of encoder quality now.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

SourKraut posted:

Is AVX-512 even going to become a big deal at some point? I thought it was generally viewed as going the way of AltiVec, and currently it's just a way for Intel to put big numbers in certain benchmarks for the "MAH FRAMEZ" crowd.

50+% per-core performance improvement is ridiculously good

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Paul MaudDib posted:

Yes, streamers are typically better off using GPU encoding. CPU encoding has better quality but it's difficult to achieve this in real-time, and if you are going to attempt to do so then the gold standard is to do the encoding on a dedicated machine since anything that exceeds the quality of GPU encoding is also going to poo poo up your framerate something fierce, even on an 8-core processor.

Which, again, is why AMD's marketing is deceptive here. Regular users are better off using GPU encoding due to substantially lower impacts on framerate, ~*professional streamers*~ are better off getting a dedicated machine to handle encoding.

I was talking about doing it in software via OpenCL vs. on the CPU, not hardware GPU encode.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
GPUs are good at embarrassingly parallel tasks that can run on shitloads of threads that don't need information from other threads. This is fundamentally opposed to how most video encoding works; x264 begins to lose quality above a certain number of threads (16-22 at 1080p?) because the video is being split into pieces that are too small. A high-end GPU, on the other hand, will want to be using tens of thousands of threads.

You can accelerate parts of the encoding process using OpenCL but every pure-GPU video encoder I've seen (that wasn't a dedicated chip) has looked like poo poo. And I'm pretty sure the CPU-only parts still end up being the bottleneck.

Winifred Madgers
Feb 12, 2002

SwissArmyDruid posted:

No. Zen, Zen.5, and ZenTwo. :colbert:

edit: Zentoo. Just like my favorite distro of Linux. :v:

VVVVVV I mean, if I were in charge of naming, I would entertain the idea, but it's hard to strike the balance of "taking the piss out of Chipzilla" and "just copying them relentlessly".

Zen, Zen 360, Zen One?

Or no, wait, I've got it. Zen, Zen 2.0 Full Speed, Zen 2.0 High Speed.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Zen 3.0, Zen 3.1, Zen 3.1 Gen 1, Zen 3.1 Gen 2

Zen-C

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE
Zen, Zen+, Zen++, Zen#, #Zen, @Zen

I kinda want to see the last two now just to see how it fucks with social media.

AzraelNewtype
Nov 9, 2004

「ブレストバーン!!」

Malloc Voidstar posted:

You can accelerate parts of the encoding process using OpenCL but every pure-GPU video encoder I've seen (that wasn't a dedicated chip) has looked like poo poo. And I'm pretty sure the CPU-only parts still end up being the bottleneck.

https://www.youtube.com/watch?v=_6XYaFqq2mg

In case you don't want to watch, the takeaway is that a GTX 660 gives Premiere's encoder exactly as much of a boost as a Vega 64. Unless x264's OpenCL branch is offloading significantly more to the GPU (and I very much doubt it does), you're super correct. I'm curious how weak/old a GPU would have to be to give worse results than the current generation if four years old is indistinguishable still, but that's sort of a moot point.

Kazinsal
Dec 13, 2011



You guys are going about this all wrong. AMD knows exactly how to push Intel's buttons now that Zen is actually competitive.

Ladies and gentlemen...




Zen Lake.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

AzraelNewtype posted:

https://www.youtube.com/watch?v=_6XYaFqq2mg

In case you don't want to watch, the takeaway is that a GTX 660 gives Premiere's encoder exactly as much of a boost as a Vega 64. Unless x264's OpenCL branch is offloading significantly more to the GPU (and I very much doubt it does), you're super correct. I'm curious how weak/old a GPU would have to be to give worse results than the current generation if four years old is indistinguishable still, but that's sort of a moot point.

This is one of those rare videos where you should actually read the comments: the bottleneck they saw had nothing to do with the hardware and everything to do with the software.

Pablo Bluth
Sep 7, 2007

I've made a huge mistake.
The devs who do the performance work on Photoshop/Lightroom/Premiere are the ones who failed to cut it in the Flash security team...

Nalin
Sep 29, 2007

Hair Elf

Pablo Bluth posted:

The devs who do the performance work on Photoshop/Lightroom/Premiere are the ones who failed to cut it in the Flash security team...

When my friend was still doing video work in Premiere for a living, Adobe became a curse word.

A SWEATY FATBEARD
Oct 6, 2012

:buddy: GAY 4 ORGANS :buddy:
Ryzen update: I indeed got dud RAM sticks; the replacement memory runs like a charm at 3200MHz. On the other hand, the 1700X isn't really overclockable: you can bump it to 3.7GHz, for example, but the system will crap out in a matter of minutes. I can only run it at stock 3.4GHz, which is a minor disappointment, but oh well. :)

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

A SWEATY FATBEARD posted:

Ryzen update: I indeed got dud RAM sticks; the replacement memory runs like a charm at 3200MHz. On the other hand, the 1700X isn't really overclockable: you can bump it to 3.7GHz, for example, but the system will crap out in a matter of minutes. I can only run it at stock 3.4GHz, which is a minor disappointment, but oh well. :)

what are you using to cool it?

A SWEATY FATBEARD
Oct 6, 2012

:buddy: GAY 4 ORGANS :buddy:

Scarecow posted:

what are you using to cool it?

Arctic Cooling Freezer 33.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy
yeah that could be your problem, what temps are you hitting with it?

A SWEATY FATBEARD
Oct 6, 2012

:buddy: GAY 4 ORGANS :buddy:

Scarecow posted:

yeah that could be your problem, what temps are you hitting with it?

At stock speed, under full load, 60C.

rex rabidorum vires
Mar 26, 2007

KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN

A SWEATY FATBEARD posted:

At stock speed, under full load, 60C.

Lol did you give the chip some more voltage?

Cygni
Nov 12, 2005

raring to post

Bristol Ridge is available at retail if you really, really need an APU and can't wait for Raven Ridge: http://www.anandtech.com/show/11819/amd-bristol-ridge-apu-retail-available

Anime Schoolgirl
Nov 28, 2002

and as usual the (7, 9)600 is the only one worth getting, as it's the only product in the stack that makes sense in ultra-cheapo-but-not-terribly-poo poo builds


SwissArmyDruid
Feb 14, 2014

by sebmojo
I would argue not even that, since these are only 35W parts. I mean, what ultra-cheapo-but-not-terribly-poo poo build do you have to make that you can't do with 65W and the commensurate perf bump?
