|
eames posted:yeah, you can even see the empty traces on the substrate. really weird to have the cpu chiplet — presumably the main source of heat dissipation — way off in a corner like that. Based on everything we know, it should be..... fine. Maybe not great, but fine. I mean, Socket AM3/3+/4 cooling is a known value at this time, and considering that AMD's test benches there were running on air cooling, clearly AMD has already done the testing needed to make sure that this will work. I also don't expect AMD to stop soldering their IHSes anytime soon. Now, on a 2-CCX 16C part? I'm thinking yeah, I may want to lap one or both of the cooler and the IHS, and/or get a full-coverage AIO waterblock, and *not* one of those Asetek designs with the round cooler. SwissArmyDruid fucked around with this message at 21:38 on Jan 9, 2019 |
# ? Jan 9, 2019 21:33 |
|
|
|
Happy_Misanthrope posted:Yeah if anyone is expecting the PS5 before late 2020 you're clueless No one said anything about it launching in 2019. It will be interesting to see performance once they get off Jaguar cores and get Ryzen in there. Plus whatever new GPU tech that isn't based on Polaris.
|
# ? Jan 9, 2019 21:35 |
|
Dadbod Apocalypse posted:This seems weird to me. I haven't owned a console since the Super Nintendo, so excuse what may be a dumb question, but...doesn't this mean that the two base systems will be fairly similar to one another? If so, the only substantive differentiations will be the online ecosystems and any locked-up exclusives? That's actually very much the case right now. They both use Jaguar and some kind of Polaris.
|
# ? Jan 9, 2019 21:36 |
|
Dadbod Apocalypse posted:This seems weird to me. I haven't owned a console since the Super Nintendo, so excuse what may be a dumb question, but...doesn't this mean that the two base systems will be fairly similar to one another? If so, the only substantive differentiations will be the online ecosystems and any locked-up exclusives? The PS4 Pro and Xbox One X have almost identical hardware; the GPU in the Xbox is slightly faster than the PS4 Pro's. Sony has been killing it with exclusives this generation though, so folks might be more inclined to pick up a PS4 over an Xbox despite it being slightly slower. Excited to see what performance gains can be had with the new consoles once they ditch their slow Jaguar CPUs, hoping for 60 FPS min for most games. The Pro was my first console since the PS2.
|
# ? Jan 9, 2019 21:40 |
|
9900k with better thermals is about what I expected, if not a little higher than I expected, and it seems they are in that range. Price (and Intel's inevitable response) will be interesting for sure. Also enjoy that a lot of what Adored had was wrong again, 'tis a tradition. If they truly can just swap that 3rd chiplet between another 8-core CPU chiplet or a GPU chiplet, that is really fuckin cool. The theoretical 16-core AM4 design would be wild, but I can't help but feel it would be extremely memory bandwidth constrained with just a dual channel bus? Also looks like the answer to the PCIe 4 discussion is that the new SKUs will "support" PCIe 4, but it's up to the motherboard manufacturers to spend the money to put it in their designs. I imagine we won't see it in cheaper designs with fewer PCB layers.
|
# ? Jan 9, 2019 21:45 |
|
Cygni posted:9900k with better thermals is about what I expected, if not a little higher than I expected, and it seems they are in the range. Price (and Intel's inevitable response) will be interesting for sure. Also enjoy that a lot of what Adored had was wrong again, 'tis a tradition. Someone said that because PCIe 4 is mechanically the same but not rated for over 7 inches of trace length, it may be possible to do a split design with a firmware update that allows the first slot to function at PCIe 4 and the rest at 3.
|
# ? Jan 9, 2019 21:51 |
|
I thought pcie 5 was just about done as well? Why even bother with 4? Does 5 even suffer from the problems that 4 has with signal retention? Google isn't being helpful, all I get is SSD reviews and poo poo.
|
# ? Jan 9, 2019 21:54 |
|
https://twitter.com/IanCutress/status/1083092859962695680 https://twitter.com/IanCutress/status/1083099086880952320
|
# ? Jan 9, 2019 21:55 |
|
Klyith posted:The only other option would be ARM. As long as AMD is willing to license their stuff for the console makers to produce it themselves, and Intel & nvidia are not, it's gonna be AMD. Interesting. I had no idea that AMD didn't control manufacturing of the console chips.
|
# ? Jan 9, 2019 22:01 |
|
eh nm
|
# ? Jan 9, 2019 22:08 |
|
Broose posted:I thought pcie 5 was just about done as well? Why even bother with 4? Does 5 even suffer from the problems that 4 has with signal retention? Google isn't being helpful, all I get is SSD reviews and poo poo. Who knows, but 16Gb/s is already way overkill for even high end workstations, so I dunno why they even bothered talking about skipping 4.
|
# ? Jan 9, 2019 22:08 |
|
Cygni posted:If they truly can just swap that 3rd chiplet between another 8core cpu chiplet or a GPU chiplet, that is really fuckin cool. The theoretical 16core AM4 design would be wild, but I can't help but feel it would be extremely memory bandwidth constrained with just a dual channel bus? I thought that too but it was pointed out to me that Rome has that ratio of memory channels to cores (64 cores, 8 channels) and if it's not a problem on servers it certainly wouldn't be an issue on desktop. Keep in mind that Zen 2 has massive caches which will help out a bit with bandwidth requirements.
|
# ? Jan 9, 2019 22:16 |
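The channels-per-core argument above is easy to sanity check with a quick back-of-the-envelope script (a rough sketch, assuming DDR4-3200 at a theoretical 25.6 GB/s per channel — peak numbers, not measured bandwidth):

```python
# Theoretical memory bandwidth available per core, assuming DDR4-3200.
DDR4_3200_GBPS_PER_CHANNEL = 25.6  # GB/s: 3200 MT/s * 8 bytes per transfer

def bandwidth_per_core(channels, cores):
    """GB/s of peak memory bandwidth available to each core."""
    return channels * DDR4_3200_GBPS_PER_CHANNEL / cores

rome = bandwidth_per_core(channels=8, cores=64)     # EPYC "Rome" top SKU
desktop = bandwidth_per_core(channels=2, cores=16)  # hypothetical 16C AM4

print(f"Rome:    {rome:.1f} GB/s per core")    # 3.2 GB/s per core
print(f"Desktop: {desktop:.1f} GB/s per core") # 3.2 GB/s per core
```

The ratios come out identical, which is the point being made: a 16-core dual-channel desktop part has the same bandwidth per core as a 64-core, 8-channel Rome.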
|
BeastOfExmoor posted:Interesting. I had no idea that AMD didn't control manufacturing of the console chips. Klyith is incorrect. AMD handles design, production, and packaging of the silicon on their end before shipping off to Sony and Microsoft's contractors. The scenario that they were referring to is more akin to what would happen if Sony and Microsoft were to license design blocks from ARM. But AMD very much controls how their IP is used and assembled, both now and in the next-generation consoles, which Sony and Microsoft are both already working with AMD on.
|
# ? Jan 9, 2019 22:16 |
|
repiv posted:https://twitter.com/IanCutress/status/1083092859962695680 Looks like a vague dodge to not cannibalize TR until they announce more details closer to launch
|
# ? Jan 9, 2019 22:19 |
|
Risky Bisquick posted:Looks like a vague dodge to not cannibalize TR until they announce more details closer to launch I agree. It definitely feels like they're trying not to Osborne Effect their current lineup, especially at the high end.
|
# ? Jan 9, 2019 22:24 |
|
I suspect the 16c configs are going to require the new chipset and mobo designs for support, so they're holding off on that.
|
# ? Jan 9, 2019 22:31 |
|
Broose posted:I thought pcie 5 was just about done as well? Why even bother with 4? Does 5 even suffer from the problems that 4 has with signal retention? The signal quality thing is physics (or: materials/cost), so PCIe 5 will suffer those problems even worse. I have no idea why people talk about "skipping 4." PCIe 5 may never even appear in consumer hardware. e: Most of the motivation for getting PCIe 5 out fast is to support faster networking hardware in servers.
Right now: 16x 8 Gbps = 100G ethernet
PCIe 4: 16x 16 Gbps = 200G ethernet
PCIe 5: 16x 32 Gbps = 400G ethernet
Right now a 200G ethernet card is a silly thing with a thick internal cable that connects to a daughter PCIe board, so you can plug them into 2 full 16x slots in the motherboard. crazypenguin fucked around with this message at 22:43 on Jan 9, 2019 |
# ? Jan 9, 2019 22:32 |
|
SwissArmyDruid posted:Klyith is incorrect. AMD handles design, production, and packaging of the silicon on their end, before shipping off to Sony and Microsoft's contractors. Huh, really? I was assuming the other way based on the xbox chip not being made at GloFo (years ago when AMD still had their GloFo exclusive contract) and previous stuff about intel & nvidia with the original Xbox. Is there any public info about how the biz relationship works?
|
# ? Jan 9, 2019 23:26 |
|
So, after Vega VII, do you think the 8C Matisse is going to be priced like a 2700x or a 9900k?
|
# ? Jan 10, 2019 00:03 |
|
Klyith posted:Huh, really? I was assuming the other way based on the xbox chip not being made at GloFo (years ago when AMD still had their GloFo exclusive contract) and previous stuff about intel & nvidia with the original Xbox. Yeah. At no point in any reporting are Sony or Microsoft ever described as "licensing" AMD's product, the same way you'd hear, say, Qualcomm used to do with ARM's core license. That's because x86 cross-licensing is an INCREDIBLY PERILOUS jenga tower of poison pills, borne out of a shared cross-licensing agreement with Intel. If you remember the AMD/Hygon/THATIC shell game that has to be done, even though the products are sold under a different name, the ultimate holder of power here is still AMD, in that the majority AMD-controlled CHMT licenses AMD tech.... from AMD, while the minority AMD-controlled CHICD (responsible for design and sales) licenses the AMD blocks from CHMT, before passing the completed design back to CHMT for manufacture, usually by TSMC or some other foundry, before going back to CHICD. If you're confused, don't worry, I'm not sure ANYONE actually understands how this setup works. Not even the lawyers that set it up. While certainly some of this shell game has to exist because China, no such complicated shell game exists with the Sony and Microsoft setups, so it's much more likely that AMD just builds the chips to Sony/Microsoft spec, sends them off to TSMC for manufacture and packaging, and then sells them the chips on contract, not unlike selling chips to any other GPU board partner. SwissArmyDruid fucked around with this message at 00:25 on Jan 10, 2019 |
# ? Jan 10, 2019 00:22 |
|
I nailed it on performance, but likely off on SKU. I wonder if the 20CU Navi die isn't wrong now, just that it's not a salvage midrange Navi part. Like I think the expected die size of a 20CU part would be in the ~70mm² region, wouldn't it?
|
# ? Jan 10, 2019 00:24 |
|
monsterzero posted:So, after VEGA VII, do you think the 8C Matisse going to be priced like a 2700x or a 9900k? AMD is not going to take the lead and undercut Intel on price - although total system cost will still likely be the same/lower with AMD's cheaper platform.
|
# ? Jan 10, 2019 00:28 |
|
The 20CU Navi would need HBM on there as well since it would be hilariously bandwidth constrained by dual channel DDR4.
|
# ? Jan 10, 2019 01:20 |
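A rough peak-bandwidth comparison illustrates why a dual-channel DDR4 bus would starve a 20CU GPU (a sketch with assumed speeds — DDR4-3200 and 14 Gbps GDDR6 — and theoretical peaks on both sides):

```python
def bus_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Theoretical peak bandwidth of a memory bus, in GB/s."""
    return bus_width_bits * gbps_per_pin / 8

# Dual-channel DDR4-3200: 2 x 64-bit channels, 3.2 Gbps per pin.
ddr4 = bus_bandwidth_gbs(128, 3.2)
# Hypothetical 128-bit GDDR6 bus at 14 Gbps per pin.
gddr6 = bus_bandwidth_gbs(128, 14.0)

print(f"Dual-channel DDR4-3200:  {ddr4:.1f} GB/s")   # 51.2 GB/s
print(f"128-bit GDDR6 @ 14 Gbps: {gddr6:.1f} GB/s")  # 224.0 GB/s
```

Roughly a 4x gap at the same bus width, which is why the HBM (or GDDR6) argument comes up for any 20CU part.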
|
.
sincx fucked around with this message at 05:50 on Mar 23, 2021 |
# ? Jan 10, 2019 01:22 |
|
I know Zen 2 is coming in the next 6 months, but if you want to game at 144hz/play single-thread-dependent games like most MMORPGs or old ones like Starcraft 2, would it be wiser to just get a 9600k now instead of Ryzen? I know it's probably bad value due to low thread count, but still...
|
# ? Jan 10, 2019 01:27 |
|
Otakufag posted:I know Zen 2 is coming in the next 6 months, but if you want to game at 144hz/play single threaded dependent games like most mmrpgs or old ones like Starcraft 2, would it be wiser to just get a 9600k now instead of Ryzen? I know it's probably bad value due to low threads, but still... Almost certainly not unless your current computer is actively broken.
|
# ? Jan 10, 2019 01:40 |
|
Seeing that some games are already having weird frame time issues with 6/6 CPUs I'd be really wary of buying one, if you just play MMOs and older games though you probably would be fine. https://www.youtube.com/watch?v=F92byoMgptU
|
# ? Jan 10, 2019 01:41 |
|
MaxxBot posted:Seeing that some games are already having weird frame time issues with 6/6 CPUs I'd be really wary of buying one, if you just play MMOs and older games though you probably would be fine. Looks like the difference in L2 Cache to me.
|
# ? Jan 10, 2019 01:52 |
|
Gamers Nexus talked to AMD's board partners and got a bit more info on the Ryzen 3000 series. Apparently the X570 chipset isn't quite ready yet, which is pushing back the launch of the CPUs since AMD wants to launch them together (but may launch the CPUs before the new chipset if the chipset slips). X570 boards will all support PCIe 4.0 on the CPU lanes, but the chipsets will keep PCIe 3.0 on the CPU side and PCIe 2.0 on the lanes off the chipset. Also, boards with 400-series chipsets could potentially support PCIe 4.0 on the CPU lanes with a Ryzen 3000 series CPU, but it will be up to the board partners to validate their boards to run at those speeds. https://www.youtube.com/watch?v=IcL4XFwptHo&t=322s Mr.Radar fucked around with this message at 02:42 on Jan 10, 2019 |
# ? Jan 10, 2019 02:39 |
|
Mr.Radar posted:Gamers Nexus talked to AMD's board partners and got a bit more info on Ryzen 3000 series. Apparently the X570 chipset isn't quite ready yet which is pushing back the launch of the CPUs since AMD wants to launch them together (but may launch the CPUs before the new chpiset if the chipset slips). X570 boards will all support PCIe 4.0 on the CPU lanes but the chipsets will keep PCIe 3.0 on the CPU side and PCIe 2.0 on the lanes off the chipset. Also, boards with 400-series chipsets could potentially support PCIe 4.0 on the CPU lanes with a Ryzen 3000 series CPU but it will be up to the board partners to validate their boards to run at those speeds. I'm ok with this. Let the stocks of 3000 series parts build up a little bit so there won't be a shortage (read: price gouging) at launch.
|
# ? Jan 10, 2019 02:57 |
|
MaxxBot posted:The 20CU Navi would need HBM on there as well since it would be hilariously bandwidth constrained by dual channel DDR4. Maybe, but if the 20CU Navi is just Navi 12 as an entire chip, then it's not necessarily a loss, as it allows them to use virtually every Navi 12 chip, including ones with defective buses. Getting throttled to hell would be a feature for further product segmentation. So I can foresee the 2019 lineup as:
Vega VII: 60CU, 16GB HBM2, 4096-bit bus, RTX 2080 competitor
Navi 10 XT: 40CU, 8GB GDDR6, 256-bit bus, RTX 2070 competitor
Navi 10 Pro: 35CU, 8GB GDDR6, 256-bit bus, RTX 2060 competitor
Navi 12 XT: 20CU, 4GB GDDR6, 128-bit bus, GTX 2050 competitor
Navi 12 Pro: 15CU, 4GB GDDR6, 128-bit bus, GTX 1050ti competitor
Navi 12 XE: 20CU, deactivated bus, GTX 1050 competitor
Navi 12 LE: 15CU, deactivated bus, RX 560 replacement
Navi 12 SE: 10CU, deactivated bus, RX 550 replacement
The rumored Navi 14 may have been scrapped if Arcturus and related chips arrive soon enough in 2020, as theoretically Navi 14 would only be replacing Vega VII with an 80CU chip to compete with the RTX 2080ti (and the cut-down version competing with the RTX 2080). Mr.Radar posted:Gamers Nexus talked to AMD's board partners and got a bit more info on Ryzen 3000 series. Apparently the X570 chipset isn't quite ready yet which is pushing back the launch of the CPUs since AMD wants to launch them together (but may launch the CPUs before the new chipset if the chipset slips). X570 boards will all support PCIe 4.0 on the CPU lanes but the chipsets will keep PCIe 3.0 on the CPU side and PCIe 2.0 on the lanes off the chipset. Also, boards with 400-series chipsets could potentially support PCIe 4.0 on the CPU lanes with a Ryzen 3000 series CPU but it will be up to the board partners to validate their boards to run at those speeds.
Actually really important to point out IMHO is that Vega VII has 128 ROPs, not 64, and that may be the largest explanation for the increase in rendering power despite clocks and features not really going anywhere. AMD likely did not fiddle with the number of shader engines, so my guess is it's still on four shader engines but they've doubled resources in those shader engines. I think this points to Navi moving to a finalized eight shader engines before AMD moves on from GCN. EmpyreanFlux fucked around with this message at 05:07 on Jan 10, 2019 |
# ? Jan 10, 2019 05:01 |
|
Doesn't GCN cap out at 64CUs?
|
# ? Jan 10, 2019 05:08 |
|
K8.0 posted:AMD is not going to take the lead and undercut Intel on price - although total system cost will still likely be the same/lower with AMD's cheaper platform. They still have to compete with Ice Lake.
|
# ? Jan 10, 2019 09:35 |
|
Here's another pic where the footprint for the second chiplet is clearly visible. https://twitter.com/brianmacocq/status/1083269332338204672?s=19
|
# ? Jan 10, 2019 09:48 |
|
EmpyreanFlux posted:So I can foresee the 2019 line up as why are you doing this to yourself
|
# ? Jan 10, 2019 09:54 |
|
I decided to get a 2600x and just wait the 3600x out instead of getting a 9600k. What are a couple good mobo recommendations that might serve me well even further, until 2020 for Ryzen 3?
|
# ? Jan 10, 2019 11:58 |
|
Cygni posted:why are you doing this to yourself Speculation never killed anyone and I already have a toxx. That and I'm bored asf.
|
# ? Jan 10, 2019 12:05 |
|
SwissArmyDruid posted:Based on everything we know, it should be..... fine. Maybe not great, but fine. I mean, Socket AM3/3+/4 cooling is a known value at this time, and considering that AMD's test benches there were running on air cooling, clearly AMD has already done the testing needed to make sure that this will work. I wonder if cooling manufacturers will come up with new, asymmetric designs for this and the somewhat obvious 2-chiplet variant. Custom waterblock manufacturers would be the first to do that; rearranging the flow channels/microfins and entry/exit ports would be no big deal, and it might drop temps by a degree or two under load. Otakufag posted:I know Zen 2 is coming in the next 6 months, but if you want to game at 144hz/play single threaded dependent games like most mmrpgs or old ones like Starcraft 2, would it be wiser to just get a 9600k now instead of Ryzen? I know it's probably bad value due to low threads, but still... Too early to tell, but one thing to keep in mind is that the memory controller of Zen 2 is on a different die than the cores, which could hurt latencies compared to the old single-die ringbus architecture. Old single-thread games at high FPS tend to scale well with low latencies, so I would not infer identical gaming performance results from identical Cinebench runs. The 6/6 9600k isn't great though; 8700k/9700k/9900k would make more sense now. eames fucked around with this message at 13:33 on Jan 10, 2019 |
# ? Jan 10, 2019 13:26 |
|
eames posted:Too early to tell but one thing to keep in mind is that the memory controller of Zen 2 is on a different Die than the cores which could hurt latencies compared to the old single-die ringbus architecture.
|
# ? Jan 10, 2019 15:19 |
|
|
|
Probably up moderately, but they're going to be more consistent, especially when they drop in the second chiplet, which is going to be good for typical home/gaming workloads that would otherwise get hurt by the NUMA domain latency and bandwidth constraints you see with multi-die Threadripper and Epyc CPUs.
|
# ? Jan 10, 2019 15:37 |