SwissArmyDruid
Feb 14, 2014

by sebmojo

eames posted:

yeah, you can even see the empty traces on the substrate. really weird to have the cpu chiplet — presumably the main source of heat dissipation — way off in a corner like that.



Based on everything we know, it should be..... fine. Maybe not great, but fine. I mean, Socket AM3/3+/4 cooling is a known value at this time, and considering that AMD's test benches there were running on air cooling, clearly AMD has already done the testing needed to make sure that this will work. I also don't expect AMD to stop soldering their IHSes anytime soon.

Now, on a 2-chiplet 16C part? I'm thinking yeah, I may want to lap one or both of the cooler and the IHS, and/or get a full-coverage AIO waterblock, and *not* one of those Asetek designs with the round coldplate.

SwissArmyDruid fucked around with this message at 21:38 on Jan 9, 2019


B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Happy_Misanthrope posted:

Yeah if anyone is expecting the PS5 before late 2020 you're clueless

No one said anything about it launching in 2019. It will be interesting to see performance once they get off Jaguar cores and get Ryzen in there. Plus whatever new GPU tech they use that isn't based on Polaris.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Dadbod Apocalypse posted:

This seems weird to me. I haven't owned a console since the Super Nintendo, so excuse what may be a dumb question, but...doesn't this mean that the two base systems will be fairly similar to one another? If so, the only substantive differentiations will be the online ecosystems and any locked-up exclusives?

That's actually very much the case right now

They both use Jaguar and some kind of Polaris

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Dadbod Apocalypse posted:

This seems weird to me. I haven't owned a console since the Super Nintendo, so excuse what may be a dumb question, but...doesn't this mean that the two base systems will be fairly similar to one another? If so, the only substantive differentiations will be the online ecosystems and any locked-up exclusives?

The PS4 Pro and Xbox One X have almost identical hardware; the GPU in the Xbox is slightly faster than the PS4 Pro's. Sony has been killing it with exclusives this generation though, so folks might be more inclined to pick up a PS4 over an Xbox despite it being slightly slower.

Excited to see what performance gains can be had with the new consoles by getting rid of their slow Jaguar CPUs; hoping for 60 FPS minimum in most games. The Pro was my first console since the PS2.

Cygni
Nov 12, 2005

raring to post

9900K performance with better thermals is about what I expected, if not a little higher than I expected, and it seems they are in that range. Price (and Intel's inevitable response) will be interesting for sure. Also enjoy that a lot of what Adored had was wrong again, 'tis a tradition. :v:

If they truly can just swap that 3rd chiplet between another 8core cpu chiplet or a GPU chiplet, that is really fuckin cool. The theoretical 16core AM4 design would be wild, but I can't help but feel it would be extremely memory bandwidth constrained with just a dual channel bus?

Also looks like the answer to the PCIe 4 discussion is that the new SKUs will "support" PCIe 4, but it's up to the motherboard manufacturers to spend the money to put it in their designs. I imagine we won't see it in cheaper designs with fewer PCB layers.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Cygni posted:

9900K performance with better thermals is about what I expected, if not a little higher than I expected, and it seems they are in that range. Price (and Intel's inevitable response) will be interesting for sure. Also enjoy that a lot of what Adored had was wrong again, 'tis a tradition. :v:

If they truly can just swap that 3rd chiplet between another 8core cpu chiplet or a GPU chiplet, that is really fuckin cool. The theoretical 16core AM4 design would be wild, but I can't help but feel it would be extremely memory bandwidth constrained with just a dual channel bus?

Also looks like the answer to the PCIe 4 discussion is that the new SKUs will "support" PCIe 4, but it's up to the motherboard manufacturers to spend the money to put it in their designs. I imagine we won't see it in cheaper designs with fewer PCB layers.

Someone said that because PCIe 4 is mechanically the same but not rated for over 7 inches of trace length, it may be possible to do a split design with a firmware update that allows the first slot to function at PCIe 4 and the rest at 3.

Broose
Oct 28, 2007
I thought PCIe 5 was just about done as well? Why even bother with 4? Does 5 even suffer from the problems that 4 has with signal integrity? Google isn't being helpful; all I get is SSD reviews and poo poo.

repiv
Aug 13, 2009

https://twitter.com/IanCutress/status/1083092859962695680

https://twitter.com/IanCutress/status/1083099086880952320

BeastOfExmoor
Aug 19, 2003

I will be gone, but not forever.

Klyith posted:

The only other option would be ARM. As long as AMD is willing to license their stuff for the console makers to produce it themselves, and Intel & nvidia are not, it's gonna be AMD.

To do a console you have to have a licensed design so you can go get it produced yourself for the cheapest costs, and keep reducing those costs over the years. Buying chips on contract is putting yourself in a position to be taken advantage of.

Interesting. I had no idea that AMD didn't control manufacturing of the console chips.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
eh nm

Dr. Fishopolis
Aug 31, 2004

ROBOT

Broose posted:

I thought PCIe 5 was just about done as well? Why even bother with 4? Does 5 even suffer from the problems that 4 has with signal integrity? Google isn't being helpful; all I get is SSD reviews and poo poo.

Who knows, but PCIe 4's 16 Gb/s per lane is already way overkill for even high-end workstations, so I dunno why they even bothered talking about skipping 4.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Cygni posted:

If they truly can just swap that 3rd chiplet between another 8core cpu chiplet or a GPU chiplet, that is really fuckin cool. The theoretical 16core AM4 design would be wild, but I can't help but feel it would be extremely memory bandwidth constrained with just a dual channel bus?

I thought that too but it was pointed out to me that Rome has that ratio of memory channels to cores (64 cores, 8 channels) and if it's not a problem on servers it certainly wouldn't be an issue on desktop. Keep in mind that Zen 2 has massive caches which will help out a bit with bandwidth requirements.
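
A quick back-of-the-envelope check of that ratio argument (a sketch: DDR4-3200 on both platforms is an assumption, and peak numbers ignore real-world efficiency):

code:
# Per-core peak memory bandwidth, Rome vs. a hypothetical 16-core AM4 part.
PER_CHANNEL_GBS = 3200e6 * 8 / 1e9   # one 64-bit DDR4-3200 channel ~= 25.6 GB/s

rome_per_core    = 8 * PER_CHANNEL_GBS / 64   # 8 channels feeding 64 cores
desktop_per_core = 2 * PER_CHANNEL_GBS / 16   # 2 channels feeding 16 cores

print(rome_per_core, desktop_per_core)        # 3.2 GB/s per core in both cases

Both come out to the same 3.2 GB/s per core, which is the channels-to-cores ratio argument above.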

SwissArmyDruid
Feb 14, 2014

by sebmojo

BeastOfExmoor posted:

Interesting. I had no idea that AMD didn't control manufacturing of the console chips.

Klyith is incorrect. AMD handles design, production, and packaging of the silicon on their end, before shipping off to Sony and Microsoft's contractors.

The scenario they were referring to is more akin to what would happen if Sony and Microsoft were to license design blocks from ARM. But AMD very much controls how their IP is used and assembled, both now and in the next-generation consoles, which Sony and Microsoft are both already working with AMD on.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

Looks like a vague dodge to not cannibalize TR until they announce more details closer to launch

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.

Risky Bisquick posted:

Looks like a vague dodge to not cannibalize TR until they announce more details closer to launch

I agree. It definitely feels like they're trying not to Osborne Effect their current lineup, especially at the high end.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

I suspect the 16C configs are going to require the new chipset and mobo designs to support them, so they're holding off on that.

crazypenguin
Mar 9, 2005
nothing witty here, move along

Broose posted:

I thought PCIe 5 was just about done as well? Why even bother with 4? Does 5 even suffer from the problems that 4 has with signal integrity?

The signal quality thing is physics (or: materials/cost), so PCIe 5 will suffer those problems even worse. I have no idea why people talk about "skipping 4." PCIe 5 may never even appear in consumer hardware.

e: Most of the motivation for getting PCIe 5 out fast is to support faster networking hardware in servers.

Right now (PCIe 3): 16x 8 Gbps = 100G ethernet
PCIe 4: 16x 16 Gbps = 200G ethernet
PCIe 5: 16x 32 Gbps = 400G ethernet

Right now a 200G ethernet card is a silly thing with a thick internal cable that connects to a daughter PCIe board, so you can plug it into 2 full 16x slots in the motherboard.
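
The arithmetic behind that table, as a sketch (the only inputs are the per-lane transfer rates and 128b/130b encoding; Ethernet-side overheads are ignored):

code:
# Usable x16 bandwidth per PCIe generation vs. the Ethernet rate it can feed.
for gen, gt_per_lane in ((3, 8), (4, 16), (5, 32)):
    usable_gbps = 16 * gt_per_lane * (128 / 130)   # 16 lanes, 128b/130b encoding
    print(f"PCIe {gen} x16: ~{usable_gbps:.0f} Gb/s usable")

# PCIe 3 x16: ~126 Gb/s  -> enough for 100G ethernet
# PCIe 4 x16: ~252 Gb/s  -> enough for 200G ethernet
# PCIe 5 x16: ~504 Gb/s  -> enough for 400G ethernet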

crazypenguin fucked around with this message at 22:43 on Jan 9, 2019

Klyith
Aug 3, 2007

GBS Pledge Week

SwissArmyDruid posted:

Klyith is incorrect. AMD handles design, production, and packaging of the silicon on their end, before shipping off to Sony and Microsoft's contractors.

The scenario they were referring to is more akin to what would happen if Sony and Microsoft were to license design blocks from ARM. But AMD very much controls how their IP is used and assembled, both now and in the next-generation consoles, which Sony and Microsoft are both already working with AMD on.

Huh, really? I was assuming the other way based on the xbox chip not being made at GloFo (years ago when AMD still had their GloFo exclusive contract) and previous stuff about intel & nvidia with the original Xbox.

Is there any public info about how the biz relationship works?

monsterzero
May 12, 2002
-=TOPGUN=-
Boys who love airplanes :respek: Boys who love boys
Lipstick Apathy
So, after Vega VII, do you think the 8C Matisse is going to be priced like a 2700X or a 9900K?

SwissArmyDruid
Feb 14, 2014

by sebmojo

Klyith posted:

Huh, really? I was assuming the other way based on the xbox chip not being made at GloFo (years ago when AMD still had their GloFo exclusive contract) and previous stuff about intel & nvidia with the original Xbox.

Is there any public info about how the biz relationship works?

Yeah. At no point in any reporting are Sony or Microsoft ever described as "licensing" AMD's product, the way you'd hear that, say, Qualcomm used to license ARM cores.

That's because x86 cross-licensing is an INCREDIBLY PERILOUS Jenga tower of poison pills, borne out of a shared cross-licensing agreement with Intel. Remember the AMD/Hygon/THATIC shell game that has to be played even though the products are sold under a different name: the ultimate holder of power there is still AMD, in that the majority-AMD-controlled CHMT licenses AMD tech.... from AMD, while the minority-AMD-controlled CHICD (responsible for design and sales) licenses the AMD blocks from CHMT, then passes the completed design back to CHMT for manufacture, usually by TSMC or some other foundry, before it goes back to CHICD.

If you're confused, don't worry, I'm not sure ANYONE actually understands how this setup works. Not even the lawyers that set it up.

While certainly some of this shell game has to exist because China, since no such complicated shell game exists with the Sony and Microsoft setups, it's much more likely that AMD just builds the chips to Sony/Microsoft spec, sends them off to TSMC for manufacture and packaging, and then sells them the chips on contract, not unlike selling chips to any other GPU board partner.

SwissArmyDruid fucked around with this message at 00:25 on Jan 10, 2019

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I nailed it on performance, but likely off on SKU.

I wonder if the 20CU Navi die isn't wrong now, just that it's not a salvage midrange Navi part. Like, I think the expected die size of a 20CU part would be in the ~70mm² region, wouldn't it?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

monsterzero posted:

So, after Vega VII, do you think the 8C Matisse is going to be priced like a 2700X or a 9900K?

AMD is not going to take the lead and undercut Intel on price - although total system cost will still likely be the same/lower with AMD's cheaper platform.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
The 20CU Navi would need HBM on there as well since it would be hilariously bandwidth constrained by dual channel DDR4.
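
The gap is easy to put numbers on (a sketch; the DDR4-3200, 14 Gbps GDDR6, and 2 Gbps HBM2 pin speeds are assumptions typical of early-2019 parts):

code:
# Peak memory bandwidth: dual-channel DDR4 vs. 128-bit GDDR6 vs. one HBM2 stack.
ddr4_dual  = 2 * 64 / 8 * 3200e6 / 1e9    # 2 x 64-bit channels @ 3200 MT/s ->  51.2 GB/s
gddr6_128  = 128 / 8 * 14e9 / 1e9         # 128-bit bus @ 14 Gbps per pin   -> 224.0 GB/s
hbm2_stack = 1024 / 8 * 2e9 / 1e9         # 1024-bit stack @ 2 Gbps per pin -> 256.0 GB/s
print(ddr4_dual, gddr6_128, hbm2_stack)

Dual-channel DDR4 delivers roughly a quarter of what even a narrow GDDR6 bus provides, hence "hilariously bandwidth constrained."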

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:50 on Mar 23, 2021

Otakufag
Aug 23, 2004
I know Zen 2 is coming in the next 6 months, but if you want to game at 144Hz and play single-thread-dependent games like most MMORPGs or old ones like StarCraft 2, would it be wiser to just get a 9600K now instead of Ryzen? I know it's probably bad value due to low thread count, but still...

NewFatMike
Jun 11, 2015

Otakufag posted:

I know Zen 2 is coming in the next 6 months, but if you want to game at 144Hz and play single-thread-dependent games like most MMORPGs or old ones like StarCraft 2, would it be wiser to just get a 9600K now instead of Ryzen? I know it's probably bad value due to low thread count, but still...

Almost certainly not unless your current computer is actively broken.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Seeing that some games are already having weird frame time issues with 6/6 CPUs, I'd be really wary of buying one; if you just play MMOs and older games, though, you probably would be fine.

https://www.youtube.com/watch?v=F92byoMgptU

redeyes
Sep 14, 2002

by Fluffdaddy

MaxxBot posted:

Seeing that some games are already having weird frame time issues with 6/6 CPUs, I'd be really wary of buying one; if you just play MMOs and older games, though, you probably would be fine.

https://www.youtube.com/watch?v=F92byoMgptU

Looks like the difference in L2 Cache to me.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.
Gamers Nexus talked to AMD's board partners and got a bit more info on the Ryzen 3000 series. Apparently the X570 chipset isn't quite ready yet, which is pushing back the launch of the CPUs since AMD wants to launch them together (but may launch the CPUs before the new chipset if the chipset slips). X570 boards will all support PCIe 4.0 on the CPU lanes, but the chipsets will keep PCIe 3.0 on the CPU side and PCIe 2.0 on the lanes off the chipset. Also, boards with 400-series chipsets could potentially support PCIe 4.0 on the CPU lanes with a Ryzen 3000 series CPU, but it will be up to the board partners to validate their boards to run at those speeds.

https://www.youtube.com/watch?v=IcL4XFwptHo&t=322s

Mr.Radar fucked around with this message at 02:42 on Jan 10, 2019

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Mr.Radar posted:

Gamers Nexus talked to AMD's board partners and got a bit more info on the Ryzen 3000 series. Apparently the X570 chipset isn't quite ready yet, which is pushing back the launch of the CPUs since AMD wants to launch them together (but may launch the CPUs before the new chipset if the chipset slips). X570 boards will all support PCIe 4.0 on the CPU lanes, but the chipsets will keep PCIe 3.0 on the CPU side and PCIe 2.0 on the lanes off the chipset. Also, boards with 400-series chipsets could potentially support PCIe 4.0 on the CPU lanes with a Ryzen 3000 series CPU, but it will be up to the board partners to validate their boards to run at those speeds.

https://www.youtube.com/watch?v=IcL4XFwptHo&t=322s

I'm ok with this. Let the stocks of 3000 series parts build up a little bit so there won't be a shortage (read: price gouging) at launch.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

MaxxBot posted:

The 20CU Navi would need HBM on there as well since it would be hilariously bandwidth constrained by dual channel DDR4.

Maybe, but if the 20CU Navi is just Navi 12 as an entire chip, then it's not necessarily a loss, as it allows them to use virtually every Navi 12 chip, including ones with defective buses. Getting throttled to hell would be a feature for further product segmentation.

So I can foresee the 2019 lineup as:

Vega VII: 60 CU, 16GB HBM2, 4096 bit bus, RTX 2080 competitor
Navi 10 XT: 40CU, 8GB GDDR6, 256 bit bus, RTX 2070 competitor
Navi 10 Pro: 35CU, 8GB GDDR6, 256 bit bus, RTX 2060 competitor
Navi 12 XT: 20CU, 4GB GDDR6, 128 bit bus, GTX 2050 competitor
Navi 12 Pro: 15CU, 4GB GDDR6, 128 bit bus, GTX 1050ti competitor
Navi 12 XE: 20CU, Deactivated Bus, GTX 1050 competitor
Navi 12 LE: 15CU, Deactivated Bus, RX 560 replacement
Navi 12 SE: 10CU, Deactivated Bus, RX 550 replacement

The rumored Navi 14 may have been scrapped if Arcturus and related chips arrive soon enough in 2020, as theoretically Navi 14 would only be replacing Vega VII with an 80CU chip to compete with the RTX 2080 Ti (and the cut-down version competing with the RTX 2080).

Mr.Radar posted:

Gamers Nexus talked to AMD's board partners and got a bit more info on the Ryzen 3000 series. Apparently the X570 chipset isn't quite ready yet, which is pushing back the launch of the CPUs since AMD wants to launch them together (but may launch the CPUs before the new chipset if the chipset slips). X570 boards will all support PCIe 4.0 on the CPU lanes, but the chipsets will keep PCIe 3.0 on the CPU side and PCIe 2.0 on the lanes off the chipset. Also, boards with 400-series chipsets could potentially support PCIe 4.0 on the CPU lanes with a Ryzen 3000 series CPU, but it will be up to the board partners to validate their boards to run at those speeds.

https://www.youtube.com/watch?v=IcL4XFwptHo&t=322s

Actually really important to point out IMHO: Vega VII has 128 ROPs, not 64, and that may be the largest explanation for the increase in rendering power despite clocks and features not really going anywhere. AMD likely did not fiddle with the number of shader engines, so my guess is it's still on four shader engines but they've doubled the resources in those shader engines. I think this points to Navi moving to a finalized eight shader engines before AMD moves on from GCN.
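
Peak pixel fillrate is the quickest way to see why the ROP count matters, since it scales linearly with ROPs at a fixed clock (a sketch; the 1.8 GHz clock is an assumed round number, not Vega VII's actual spec):

code:
# Peak pixel fillrate = ROPs x clock; doubling ROPs doubles it at the same clock.
clock_ghz = 1.8   # assumed boost clock
for rops in (64, 128):
    print(f"{rops} ROPs: {rops * clock_ghz:.0f} Gpixel/s peak")

# 64 ROPs:  115 Gpixel/s
# 128 ROPs: 230 Gpixel/s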

EmpyreanFlux fucked around with this message at 05:07 on Jan 10, 2019

NewFatMike
Jun 11, 2015

Doesn't GCN cap out at 64CUs?

Arzachel
May 12, 2012

K8.0 posted:

AMD is not going to take the lead and undercut Intel on price - although total system cost will still likely be the same/lower with AMD's cheaper platform.

They still have to compete with Ice Lake.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Here's another pic where the footprint for the second chiplet is clearly visible.

https://twitter.com/brianmacocq/status/1083269332338204672?s=19

Cygni
Nov 12, 2005

raring to post

EmpyreanFlux posted:

So I can foresee the 2019 lineup as:

Vega VII: 60 CU, 16GB HBM2, 4096 bit bus, RTX 2080 competitor
Navi 10 XT: 40CU, 8GB GDDR6, 256 bit bus, RTX 2070 competitor
Navi 10 Pro: 35CU, 8GB GDDR6, 256 bit bus, RTX 2060 competitor
Navi 12 XT: 20CU, 4GB GDDR6, 128 bit bus, GTX 2050 competitor
Navi 12 Pro: 15CU, 4GB GDDR6, 128 bit bus, GTX 1050ti competitor
Navi 12 XE: 20CU, Deactivated Bus, GTX 1050 competitor
Navi 12 LE: 15CU, Deactivated Bus, RX 560 replacement
Navi 12 SE: 10CU, Deactivated Bus, RX 550 replacement

why are you doing this to yourself

Otakufag
Aug 23, 2004
I decided to get a 2600X and just wait out the 3600X instead of getting a 9600K. What are a couple of good mobo recommendations that might serve me well even further, until Ryzen 3 in 2020?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Cygni posted:

why are you doing this to yourself

Speculation never killed anyone, and I already have a toxx. That and I'm bored asf.

eames
May 9, 2009

SwissArmyDruid posted:

Based on everything we know, it should be..... fine. Maybe not great, but fine. I mean, Socket AM3/3+/4 cooling is a known value at this time, and considering that AMD's test benches there were running on air cooling, clearly AMD has already done the testing needed to make sure that this will work.

I wonder if cooling manufacturers will come up with new, asymmetric designs for this and the somewhat obvious 2-chiplet variant.
Custom waterblock manufacturers would be the first to do that; rearranging the flow channels/microfins and entry/exit ports would be no big deal, and it might drop temps by a degree or two under load.

Otakufag posted:

I know Zen 2 is coming in the next 6 months, but if you want to game at 144Hz and play single-thread-dependent games like most MMORPGs or old ones like StarCraft 2, would it be wiser to just get a 9600K now instead of Ryzen? I know it's probably bad value due to low thread count, but still...

Too early to tell, but one thing to keep in mind is that the memory controller of Zen 2 is on a different die than the cores, which could hurt latencies compared to the old single-die ring-bus architecture.
Old single-threaded games at high FPS tend to scale well with low latencies, so I would not infer identical gaming performance from identical Cinebench runs.
The 6/6 9600K isn't great though; the 8700K/9700K/9900K would make more sense now.
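
A toy illustration of why high-FPS, single-thread-bound games are latency sensitive (every input here is assumed for the sake of the sketch; none of it is measured):

code:
# Toy model: extra frame time from an added memory-latency penalty.
frame_budget_ms  = 1000 / 144    # ~6.9 ms per frame at 144 FPS
misses_per_frame = 2e5           # assumed last-level-cache misses per frame
extra_ns         = 10            # assumed IO-die latency penalty per miss
extra_ms = misses_per_frame * extra_ns / 1e6
print(f"{extra_ms:.1f} ms extra, {extra_ms / frame_budget_ms:.0%} of the frame budget")
# -> 2.0 ms extra, ~29% of a 144 FPS frame budget

With assumed numbers like these, even a 10 ns penalty eats a meaningful slice of the frame budget, which is the point about not inferring gaming results from Cinebench runs.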

eames fucked around with this message at 13:33 on Jan 10, 2019

Otakufag
Aug 23, 2004

eames posted:

Too early to tell, but one thing to keep in mind is that the memory controller of Zen 2 is on a different die than the cores, which could hurt latencies compared to the old single-die ring-bus architecture.
Old single-threaded games at high FPS tend to scale well with low latencies, so I would not infer identical gaming performance from identical Cinebench runs.
The 6/6 9600K isn't great though; the 8700K/9700K/9900K would make more sense now.
So latencies are going up no matter what? AMD isn't implementing something to counter that?


BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Probably up moderately, but they're going to be more consistent, especially when they drop in the second chiplet. That's going to be good for typical home/gaming workloads, which would otherwise get hurt by the NUMA-domain latency and bandwidth constraints you see with multi-chiplet Threadripper and Epyc CPUs.
