Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Paul MaudDib posted:

I think they're going to go even lower and that may pose a problem for the new chips.

Would you buy a 4700X at $350 or would you buy a 3700X at $200?

Depends a lot on the performance delta, but I would quite likely go with the 4700X because, amortized over the lifetime of the CPU or the number of things I'm going to render/slice/compile/virtualize, $150 isn't a big deal. That plump unified L3 looks pretty nice!

E: Of course, it’ll be a $350 price gap in Canada, sooo
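
As a rough sketch of that amortization argument (the service life and job counts below are made-up illustrative numbers, not anything from the posts above):

code:

# Back-of-envelope amortization of the hypothetical $150 price gap.
# Service life and jobs/day are assumptions for illustration only.
price_gap_usd = 350 - 200      # 4700X vs. 3700X prices from the quote
years_of_service = 5           # assumed useful lifetime of the CPU
jobs_per_day = 20              # assumed renders/compiles/VM spins per day

per_year = price_gap_usd / years_of_service
per_job = price_gap_usd / (years_of_service * 365 * jobs_per_day)
print(f"${per_year:.2f} per year, ${per_job:.4f} per job")
# -> $30.00 per year, $0.0041 per job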

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

ConanTheLibrarian posted:

Great, now I can be underwhelmed by Zen3 just like I was with Zen2 and put off replacing my 3570k in anticipation of the next ryzen release.

Not saying that your standards for replacement aren't valid, but what ratio of price-to-performance increase are you looking at here?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

MikeC posted:

Not sure where you are getting APUs skipping RDNA2 from those slides.

RDNA3 making it into 2021 makes me feel like it's less than 12 months for RDNA2, and thus most of the design work would be done. Further, Cezanne is apparently Vega, not a flavor of RDNA, which means the 2021 APU is Zen3/Vega on 7nm. Also, RDNA3 is a native 5nm design, and Rembrandt is supposed to be a 5nm design as well.

So if they already have experience with RDNA3, the next iGPU is Vega, and the next APU after that is 5nm, it seems like less design work overall to just skip to 5nm RDNA3? If RDNA3 were coming in 2022 I could see the argument for an RDNA2 5nm iGPU being stronger, but if it's less than a year to RDNA3 that argument seems much weaker.

There is Van Gogh, which is supposedly 7nm Zen2/RDNA2, but my understanding is that's Microsoft asking for a Surface chip, and the target is low power, sub-10W.

EmpyreanFlux fucked around with this message at 16:09 on Jul 29, 2020

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

gently caress it, I would buy a GPU with the code name Van Gogh regardless of performance

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Nomyth posted:

gently caress it, I would buy a GPU with the code name Van Gogh regardless of performance

spend forever trying to figure out why it only outputs audio in mono

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Gets a 3rd-party driver update that triples its performance after the model loses support and goes EOL

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Nomyth posted:

Gets a 3rd-party driver update that triples its performance after the model loses support and goes EOL

So Linux Mesa drivers.

SwissArmyDruid
Feb 14, 2014

by sebmojo

EmpyreanFlux posted:

RDNA3 making it into 2021 makes me feel like it's less than 12 months for RDNA2, and thus most of the design work would be done. Further, Cezanne is apparently Vega, not a flavor of RDNA, which means the 2021 APU is Zen3/Vega on 7nm. Also, RDNA3 is a native 5nm design, and Rembrandt is supposed to be a 5nm design as well.

So if they already have experience with RDNA3, the next iGPU is Vega, and the next APU after that is 5nm, it seems like less design work overall to just skip to 5nm RDNA3? If RDNA3 were coming in 2022 I could see the argument for an RDNA2 5nm iGPU being stronger, but if it's less than a year to RDNA3 that argument seems much weaker.

There is Van Gogh, which is supposedly 7nm Zen2/RDNA2, but my understanding is that's Microsoft asking for a Surface chip, and the target is low power, sub-10W.

....do we *have* RDNA 1 APUs, aside from what's in the consoles? Like, one of my continued complaints about AMD is that GCN.

Just.

Won't.

loving.

Die.

And for all their talk about how RDNA is a vastly improved and more efficient version of Vega, they still shoved Vega into the 4000-series APUs instead of something RDNA-derived.

RDNA in the form of the Radeon 5000-series graphics cards has been with us for over a year now. And based on how bad the launch drivers were, and how owners were waiting FOR MONTHS for bugfixes, I opined that RDNA 1 was not something that AMD wanted to spend a lot of money on, and that it was the culmination of all of the lessons that they had learned from developing the console chips, slapped together into a discrete product that would further stretch the money that they had gotten from Microsoft and Sony. It also felt like something they were shoving out just to keep video game benchmarks and reviews from just being Nvidia vs. Nvidia, and that they would much rather spend their time, money, and driver-programmer man-hours on RDNA 2, whose cards we are getting this year.

But back on point, I feel like RDNA 2 APUs are more likely than RDNA 1 APUs and then skipping straight to RDNA3.

SwissArmyDruid fucked around with this message at 16:18 on Jul 29, 2020

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

SwissArmyDruid posted:

And for all their talk about how RDNA is a vastly improved and more efficient version of Vega, they still shoved Vega into the 4000-series APUs instead of something RDNA-derived.

On the one hand this is because Vega performs really nicely in low-power regimes but scales up horribly, so APUs are a great place to let it shine. On the other hand, RDNA has been out for a year now, so there's definitely an argument to be made for putting RDNA in the 4000-series APUs. I'm sure it would be very interesting to know the facts behind why AMD made this decision.

quote:

RDNA 1 was not something that AMD wanted to spend a lot of money on, and that it was the culmination of all of the lessons that they had learned from developing the console chips, slapped together into a discrete product that would further stretch the money that they had gotten from Microsoft and Sony

I don't think it's a secret that RDNA as it stands today is "GCN4, but optimized for video game engines rather than trying to also be good at compute" -- AKA "what Nvidia has been doing for several years".

But my assumption is that the money from console work came in after development of RDNA. I have no statements from the companies involved to back this up, but through the lens of my experience at a hyperscale datacenter operator, evaluating pre-prod CPUs from Intel and AMD, I reckon it was something like:
  • AMD desperately needs a graphics solution that more closely matches the architectures that Nvidia is pushing to the consumer side, for market reasons
  • RDNA gets pushed out the door last July, with software that was absolutely underbaked
  • But the hardware is solid, and console releases are 18 months out, so the semicustom division and Sony/MS agree to PS5/XSX on an RDNA baseline (which also looks good to the market)
  • There had to have been at least engineering verification test grade (and probably more like production verification) silicon for Sony and MS to sign off on this
  • Therefore I assume AMD fronted the development costs themselves
  • Though console contracts make for a really good and safe way to get paid back for that work. Safer than the consumer market anyway

Klyith
Aug 3, 2007

GBS Pledge Week

SwissArmyDruid posted:

And for all their talk about how RDNA is a vastly improved and more efficient version of Vega, they still shoved Vega into the 4000-series APUs instead of something RDNA-derived.

RDNA in the form of the Radeon 5000-series graphics cards has been with us for over a year now. And based on how bad the launch drivers were, and how owners were waiting FOR MONTHS for bugfixes, I opined that RDNA 1 was not something that AMD wanted to spend a lot of money on, and that it was the culmination of all of the lessons that they had learned from developing the console chips, slapped together into a discrete product that would further stretch the money that they had gotten from Microsoft and Sony.

The consoles have what they're calling RDNA 2, so given that and the release sequence I think it's pretty obvious that RDNA 1 was the work-in-progress version, where they'd gotten the basic GPU pipeline redesigned but hadn't finished the next-gen features. And based on the issues with 5700s, I think it's more than just drivers at fault; there are probably some quirks in silicon that they just can't totally solve with software. I doubt it's a question of money; this is the product line they have, and they'd be fools to slack on support just because :effort:. It's not like AMD is penniless these days, either.

Owning a 5700, it feels like a prototype. I haven't had enough problems to make me hate it (and getting it at a very good price helps), but it's frequently just a bit whack. Like, recently I've been seeing an occasional corruption sparkle effect around the mouse cursor when the cursor is over a youtube video. Harmless but whack.

SwissArmyDruid posted:

But back on point, I feel like RDNA 2 APUs are more likely than RDNA 1 APUs and then skipping straight to RDNA3.

Yeah, given that they've got RDNA 2 APUs in the consoles, that's pretty clearly what they're gonna do.

Also as much as Vega sucks, I don't think it makes that much difference in an APU. Vega sucked as a high-end GPU because it couldn't effectively use the bandwidth of HBM. In an APU that's not a problem.


e:

mdxi posted:

I don't think it's a secret that RDNA as it stands today is "GCN4, but optimized for video game engines rather than trying to also be good at compute"

David Kanter disagrees with you.

Klyith fucked around with this message at 17:23 on Jul 29, 2020

SwissArmyDruid
Feb 14, 2014

by sebmojo

Klyith posted:

Also as much as Vega sucks, I don't think it makes that much difference in an APU. Vega sucked as a high-end GPU because it couldn't effectively use the bandwidth of HBM. In an APU that's not a problem.

In a pre-DDR5 APU, that's not a problem, but hopefully we can finally put a goddamn stake in the GCN zombie.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Klyith posted:

like, recently I've been seeing an occasional corruption sparkle effect around the mouse cursor when the cursor is over a youtube video. Harmless but whack.

Have you tried testing your system RAM? I found I was getting weird artifacts only while gaming, and it turned out my RAM couldn't deal with the heat the card was pumping out, which made my entire system unstable while gaming. I played around with my system RAM sub-timings a bit, and I haven't had a crash in 4 months, when it used to be that every 2-3 hours I could expect a burst of artifacts or a blue screen referencing the video driver.
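
For a quick smoke test before rebooting into a proper memtest, something like this minimal Python sketch can help; it only exercises whatever RAM the OS hands the process, so treat it as a sanity check, not a substitute for an overnight memtest run:

code:

import os

# Minimal userspace RAM pattern test: fill a buffer with pseudorandom
# data, copy it, and verify the copy. This catches only gross
# instability; a bootable memtest covers far more of the address space.
CHUNK_MB = 256    # per-pass buffer size; raise it to stress more RAM
PASSES = 8

for i in range(PASSES):
    pattern = os.urandom(CHUNK_MB * 1024 * 1024)
    copy = bytearray(pattern)      # full write of a second buffer
    if copy != pattern:            # full read-back and compare
        print(f"pass {i}: MISMATCH - RAM or memory controller unstable")
        break
else:
    print(f"all {PASSES} passes clean ({CHUNK_MB} MB each)")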

VorpalFish
Mar 22, 2007
reasonably awesome™

Paul MaudDib posted:

I think they're going to go even lower and that may pose a problem for the new chips.

Would you buy a 4700X at $350 or would you buy a 3700X at $200?

I doubt Zen2 will coexist with Zen3 as a lower-cost alternative the way Zen+ has with Zen2, since they're likely fabbed on the same process and there's not exactly a lot of capacity available on TSMC 7nm.

Doesn't seem worth it to port it to another process, either.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Klyith posted:

Owning a 5700, it feels like a prototype. I haven't had enough problems to make me hate it (and getting it at a very good price helps), but it's frequently just a bit whack. Like, recently I've been seeing an occasional corruption sparkle effect around the mouse cursor when the cursor is over a youtube video. Harmless but whack.

I also have a 5700, and use it under Linux. For the first two weeks of its life it couldn't even run in a 2D accelerated mode; it was a $300 1024x768 VESA framebuffer. After that the drivers improved quickly though, and since Mesa 20.0.1 it's been very stable for me. I never saw the kind of issues that you describe though; for me it was pseudorandom lockups. I say pseudorandom, because it only ever happened with Chrome focused -- even playing Windows games through Proton under Steam was safe -- but I could never find a commonality beyond that. So it either worked perfectly, or crashed the system with Chrome foregrounded.

I'm not sure how, but the AMDGPU drivers seem to be consistently higher quality than the Windows side of the house for these cards. Windows users on Reddit still say things like "the 5700 idles hot!" too, meanwhile my card idles at 8 (eight) watts and 45C. It's a weird card, and the drivers are weird, and the whole situation is weird. I'd blame 2020, but this all started in 2019.
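
For anyone who wants to check those idle numbers on their own card, amdgpu exposes power and temperature through the standard hwmon sysfs interface. A minimal sketch (assuming card0 is the Navi card; the hwmon directory name varies between systems and boots):

code:

import glob

# Read idle power draw and temperature from amdgpu's hwmon interface.
# Assumes card0 is the GPU of interest; adjust the path if not.
hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

with open(f"{hwmon}/power1_average") as f:
    watts = int(f.read()) / 1_000_000   # sysfs reports microwatts

with open(f"{hwmon}/temp1_input") as f:
    celsius = int(f.read()) / 1000      # sysfs reports millidegrees C

print(f"{watts:.1f} W, {celsius:.0f} C")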

quote:

David Kanter disagrees with you.

I'm a sysadmin, not a CompE, so I was going on what I had read from various places. If it ain't, then it ain't.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

SwissArmyDruid posted:

In a pre-DDR5 APU, that's not a problem, but hopefully we can finally put a goddamn stake in the GCN zombie.

GCN the architecture or GCN the ISA? The latter is fine iirc

e: I thought RDNA was basically a GCN6 ISA, but it's a new thing too: https://developer.amd.com/wp-content/resources/RDNA_Shader_ISA.pdf

Malcolm XML fucked around with this message at 18:23 on Jul 29, 2020

Klyith
Aug 3, 2007

GBS Pledge Week

pixaal posted:

Have you tried testing your system RAM? I found I was getting weird artifacts only while gaming, and it turned out my RAM couldn't deal with the heat the card was pumping out, which made my entire system unstable while gaming. I played around with my system RAM sub-timings a bit, and I haven't had a crash in 4 months, when it used to be that every 2-3 hours I could expect a burst of artifacts or a blue screen referencing the video driver.

System RAM is good, unless it's a recent failure. Last winter I spent a while tweaking until I had solid settings for 3200-C16 (the RAM sticks are XMP 3000-C15, but I had zero luck with geardown and the setting that allows my Zen 1 CPU to run at odd CAS numbers). The conclusion of that project was an overnight run of memtest with no errors.

Since it's summer I have the GPU set to a negative power target, and I suspect that's what causes the occasional flicker. Not a big enough deal for me to reset it; I'd rather have less heat than a perfect mouse cursor. But I think the power management system is one of the things that wasn't quite done when they pushed it out the door. I had much worse issues with power targets and fan curves last year. As long as it's just the cursor being weird and not full crashes, I'm OK leaving it as is.

mdxi posted:

I'm not sure how, but the AMDGPU drivers seem to be consistently higher quality than the Windows side of the house for these cards. Windows users on Reddit still say things like "the 5700 idles hot!" too, meanwhile my card idles at 8 (eight) watts and 45C. It's a weird card, and the drivers are weird, and the whole situation is weird. I'd blame 2020, but this all started in 2019.

AMD made bigger areas of the Linux driver open source, and that probably helps, both because Linux nerds fix the driver for them, and because the kernel and window-manager people can see inside the driver APIs rather than just poking at a black box.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

But back on point, I feel like RDNA 2 APUs are more likely than RDNA 1 APUs and then skipping straight to RDNA3.

I'm not sure where I mentioned we might have RDNA1 APUs? My impression is that RDNA1 is kind of half-baked and forced out the door, whereas RDNA2 is the actual feature-complete uarch, with RDNA3 likely improving on efficiency and ray tracing. I don't think we'd ever see an RDNA1 APU; it makes way more sense to jump to RDNA2 at least. But if RDNA2 and RDNA3 were essentially parallel projects, like Zen2 and Zen3 were, Rembrandt is likely RDNA3, solely based on the fact that they'd both be native 5nm designs, rather than trying to port RDNA2 to 5nm. That's just extra, pointless work IMHO.

If RDNA3 wasn't ready in time for Rembrandt, then yeah, they definitely went with RDNA2.

EDIT: FWIW, Cezanne is listed as having "Vega20", which I don't even know how the gently caress that doesn't get bottlenecked without insane compression or HBM. https://www.igorslab.de/en/ryzen-4000-vermeer-im-b0-stepping-factual-market-ripe-cezanne-with-vega-andvan-gogh-with-navi-one-step-back/. If that's meant for desktop and premium laptops, AMD is going to crush Tiger Lake's hopes and dreams like a soda can; even a pessimistic 50% improvement is roughly Tahiti, and at 90% performance scaling you're looking at just trailing a GTX 1650 (and matching it on desktop, where you can push it harder).

EmpyreanFlux fucked around with this message at 00:42 on Jul 30, 2020

FuturePastNow
May 19, 2014


Aside from the custom console silicon, AMD's APU graphics have always been a generation behind the discrete GPUs and I don't imagine that will change.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Klyith posted:


Since it's summer I have the GPU set to a negative power target, and I suspect that's what causes the occasional flicker.


That is probably the problem. My 5700 does not like lower power targets; mine is more stable with a power target above stock, and I have the sliders at their max. I'm running my RAM at DDR4-3666 CL20; DDR4-3666 CL18 was not stable with the GPU, though it tests perfectly fine if you don't engage the GPU at the same time.

edit: sorry, not at 4k; it's summer and the IF isn't stable at 2000, so I can only get it to 1833, still running at CL20. Might be able to get it to CL18 now.

sauer kraut
Oct 2, 2004

Hey, since a lot of you seem to be super into those APUs: Igor just posted a warning article about Renoir on mobile platforms. It's limited to PCIe 3.0 x8 for the dGPU on those.
If you're looking to buy one of them prebuilt NUC boxes, double-check what kind of mainboard is used in there and how the lanes are set up.
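
For a sense of what the x8 limit costs in theoretical peak bandwidth, the math is straightforward (PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding; real-world throughput is lower once protocol overhead is counted):

code:

# Theoretical peak PCIe 3.0 throughput per link width.
GT_PER_S = 8.0        # PCIe 3.0 per-lane signalling rate
ENCODING = 128 / 130  # 128b/130b line encoding

def peak_gb_s(lanes: int) -> float:
    return GT_PER_S * ENCODING * lanes / 8  # /8 converts bits to bytes

print(f"x8:  {peak_gb_s(8):.2f} GB/s")   # ~7.88 GB/s
print(f"x16: {peak_gb_s(16):.2f} GB/s")  # ~15.75 GB/s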

orcane
Jun 13, 2012

Fun Shoe

The mobile CPUs are limited, so wouldn't you check those instead of the mainboard?

If that NUC box is using the newly released Renoir desktop APUs, it should be fine.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

you aren't putting a dGPU in a NUC anyway, not sure why that would matter

socketed APUs will have a full x16 on the graphics lanes (although possibly still 3.0 only?)

the bigger bummer is that the undersized cache is starting to have a significant impact on IPC. Renoir is closer to Pinnacle Ridge IPC than to Matisse. Supposedly the move was made to reduce power consumption (especially idle power) on mobile, but desktop doesn't care about that. So you are in a situation where a 10700K is going to have noticeably better IPC and clocks than a 4700G (or a 10600K vs. a 3600G, or whatever), and office builds don't care about having a giant iGPU.

https://www.hardwaretimes.com/amds-ryzen-7-4700g-renoir-desktop-is-slower-than-the-ryzen-5-3600-in-gaming-w-dgpus/

Paul MaudDib fucked around with this message at 21:03 on Jul 30, 2020

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Paul MaudDib posted:

you aren't putting a dGPU in a NUC anyway, not sure why that would matter

Yeah, you can. That was one of the big points of the Intel NUC 9 stuff released this year.

Llamadeus
Dec 20, 2005

Why would that be relevant for an AMD chip though?

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Llamadeus posted:

Why would that be relevant for an AMD chip though?

I just saw the mention of NUC, and since that's kind of an Intel-branded thing, I thought that comparison was being made.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

https://twitter.com/CommieGIR/status/1289026703508070401?s=20

lDDQD
Apr 16, 2006

EmpyreanFlux posted:

EDIT: FWIW, Cezanne is listed as having "Vega20", which I don't even know how the gently caress that doesn't get bottlenecked without insane compression or HBM.
Lol, Vega 20 is Radeon VII. Reading that link, it looks like they datamined some configuration files from a beta driver or something, and then went on to conclude that another APU will have Navi 21, which is equally ridiculous. More likely, these APUs simply share the same version of the IP blocks found in Vega 20 and Navi 21 (respectively), and the driver will execute that code path.

lDDQD fucked around with this message at 03:48 on Jul 31, 2020

Seamonster
Apr 30, 2007

IMMER SIEGREICH

I wonder how much of that $600 cost (to the consumer) is the DDR4 FRICKIN 4266. I can't even find SO-DIMMs faster than 3200 on Newegg. That is some blazing-fast RAM, yeesh.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

lDDQD posted:

Lol, Vega 20 is Radeon VII. Reading that link, it looks like they datamined some configuration files from a beta driver or something, and then went on to conclude that another APU will have Navi 21, which is equally ridiculous. More likely, these APUs simply share the same version of the IP blocks found in Vega 20 and Navi 21 (respectively), and the driver will execute that code path.

I hosed up; I completely forgot about the Radeon VII's codename, and your theory makes more sense.

I guess Cezanne is just going to be Renoir but with Zen3; I don't see how the iGPU can budge since it'll still be using Vega20 IP. Maybe it can borrow some IP from RDNA2, like improved color compression, but there's no point upping the CU count because 8 CUs @ 1.8 or 2.1GHz are still bottlenecked by DDR4-4266.
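
The arithmetic behind that bottleneck claim is simple: peak memory bandwidth is transfer rate times bus width. A sketch (the GTX 1650 figure is the GDDR5 desktop model's published spec; the APU number assumes plain dual-channel DDR4):

code:

# Peak memory bandwidth = transfer rate (MT/s) x bus width (bytes).
def peak_gb_s(mt_per_s: float, bus_bits: int) -> float:
    return mt_per_s * (bus_bits / 8) / 1000

apu = peak_gb_s(4266, 128)       # dual-channel DDR4-4266, shared with the CPU
gtx_1650 = peak_gb_s(8000, 128)  # GTX 1650 (GDDR5): 8 Gbps on a 128-bit bus

print(f"APU:      {apu:.1f} GB/s (shared)")  # ~68.3 GB/s
print(f"GTX 1650: {gtx_1650:.1f} GB/s")      # 128.0 GB/s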

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

Seamonster posted:

I wonder how much of that $600 cost (to the consumer) is the DDR4 FRICKIN 4266. I can't even find SO-DIMMs faster than 3200 on Newegg. That is some blazing-fast RAM, yeesh.

Probably so they can use the Vega iGPU to its fullest, to make it cheaper than adding a dedicated GPU.

Cygni
Nov 12, 2005

raring to post

Seamonster posted:

I wonder how much of that $600 cost (to the consumer) is the DDR4 FRICKIN 4266. I can't even find SO-DIMMs faster than 3200 on Newegg. That is some blazing-fast RAM, yeesh.

I'm pretty certain that is LPDDR4X in its single channel die config for space/power reasons.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Of course I get to be the person with the nightmare scenario everyone dreamed of: my X570 chipset fan has a bad bearing or something. This is an ASRock Taichi that I've had for 8 months. I think I'll just have the fan curve utility start up automatically and keep the fan turned off, rather than RMA the board over this.

ijyt
Apr 10, 2012

Are BIOS updates still needed for B450 boards? I have a friend wanting to build a new gaming PC and I suggested a 3600 on a B450, but if he needs to flash the BIOS then that's gonna be a bit of a dealbreaker.

Klyith
Aug 3, 2007

GBS Pledge Week

ijyt posted:

Are BIOS updates still needed for B450 boards? I have a friend wanting to build a new gaming PC and I suggested a 3600 on a B450, but if he needs to flash the BIOS then that's gonna be a bit of a dealbreaker.

No, with possible exceptions for low-volume ITX boards that aren't popular.

Molten Llama
Sep 20, 2006

ijyt posted:

Are BIOS updates still needed for B450 boards? I have a friend wanting to build a new gaming PC and I suggested a 3600 on a B450, but if he needs to flash the BIOS then that's gonna be a bit of a dealbreaker.

Highly unlikely, but even then a decent chunk of B450 boards can be flashed from USB with nothing installed.

Anime Schoolgirl
Nov 28, 2002

https://www.anandtech.com/show/15953/amd-zen-now-at-6w-tdp-dual-core-for-education

quote:

The Lenovo 100e 2nd Gen will use this new chip, Windows 10, Wi-Fi 6, 64 GB eMMC, 4 GB DDR4, an 11.6-inch 13x7 display (250 nits), but offer a hard wearing design suitable for bumps and scrapes as well as ~12 hours of battery life, with quick charging providing 80% power in an hour.

The Lenovo 300e 2nd Gen is a similar build but offers a 360-degree hinge, pen support, with an optional 128 GB SSD. Battery is 42 Wh, rated at ~12 hours.

The 100e will start at $219 and the 300e will start at $299, available from September. Both devices have a variety of student-focused software options focusing on teaching and security.

the shitbook laptop segment might finally find salvation from the hell that is Atom


...nah intel's still gonna sell those for $5 and atom laptops will still be inexplicably more expensive than these

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Anime Schoolgirl posted:

https://www.anandtech.com/show/15953/amd-zen-now-at-6w-tdp-dual-core-for-education

the shitbook laptop segment might finally find salvation from the hell that is Atom


...nah intel's still gonna sell those for $5 and atom laptops will still be inexplicably more expensive than these

dang, I'd love to replace my i3-5005U with one of those Athlons

SwissArmyDruid
Feb 14, 2014

by sebmojo

Adored is still absolute and complete bollocks, right? Just checking, because you might want to take an entire can of Morton's for this. (For you Diamond fans, two cans.)

quote:

In our most recent video, we reported Raja Koduri was going to be leaving Intel within Q3 (this quarter), which is to say by the end of September. However, many of you will have read Intel’s recent announcement which states that Koduri will continue to lead Intel’s graphics division. Our sources have since reiterated to us that Koduri will still be leaving, and the proof is actually right in Intel’s announcement.

[...]

Finally, our Taiwanese sources say Intel will eventually cancel Xe and dissolve the graphics division. [Edit: To be clear, “the graphics division” refers to the team Raja Koduri assembled to work on discrete graphics like Xe HP and HPC, not to integrated graphics (such as Xe LP or Gen 12) which existed well before Raja Koduri was even hired.] We can’t be quite sure when this will happen, but given DG3’s cancelation and its prior launch target of 2023, 2023 could be the year Xe finally ends. The reason for Xe’s cancelation is just down to money. It has cost Intel about $500 million to fund Arctic Sound and DG1 (with Arctic Sound taking the lion’s share of that sum of money) and these graphics projects have yielded few results. Bob Swan is a financially focused CEO and he will seek to start cutting Intel’s lowest margin and least profitable ventures. Xe is first on the chopping block.

https://adoredtv.com/exclusive-arctic-sound-family-ponte-vecchio-and-the-future-of-intel-graphics/

Hmm. Maybe *two* cans of Morton.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

Anime Schoolgirl posted:

https://www.anandtech.com/show/15953/amd-zen-now-at-6w-tdp-dual-core-for-education

the shitbook laptop segment might finally find salvation from the hell that is Atom


...nah intel's still gonna sell those for $5 and atom laptops will still be inexplicably more expensive than these

Seems like Atom already got pushed out by Intel's Pentium and Celeron brands anyway; Atom seems to be relegated to SBCs and smaller now.

Anime Schoolgirl
Nov 28, 2002

CommieGIR posted:

Seems like Atom already got pushed out by Intel's Pentium and Celeron brands anyway; Atom seems to be relegated to SBCs and smaller now.

pentium silver and celeron silver are based on atom cores, which is what I am referring to
