Cygni
Nov 12, 2005

raring to post



mdxi posted:

I'm really, really hoping for a 65W part with 12 cores, as that would give me 48c/96t across my tiny compute farm.

I mean, you can turn the inevitable 12-core Ryzen 3 into a 65W part in the BIOS. I wouldn't expect very impressive clocks though.

Anime Schoolgirl
Nov 28, 2002

~perfect~
battlebrother





You can run 14nm Ryzen with 8 cores at 3.0GHz drawing just 35W under load from the CPU itself.

That clockspeed is pretty impressive at that power draw.
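
For a rough sense of why low clocks are so cheap, dynamic CPU power scales roughly with cores x V^2 x f. A quick back-of-envelope sketch in Python (every constant here is an assumption calibrated to the ~35W figure above, not a measurement of any real Ryzen part):

code:
# Back-of-envelope only: dynamic CPU power scales roughly with
# cores * voltage^2 * frequency. The fudge factor k is an assumption
# calibrated to the "8 cores @ 3.0GHz ~= 35W" claim above, not a
# measurement of any real Ryzen part.

def core_power_w(freq_ghz, vcore, k=1.55):
    """Rough dynamic power per core in watts."""
    return k * vcore**2 * freq_ghz

def package_power_w(cores, freq_ghz, vcore, uncore_w=5.0):
    """All cores plus a fixed uncore/SoC allowance."""
    return cores * core_power_w(freq_ghz, vcore) + uncore_w

print(round(package_power_w(8, 3.0, 0.90), 1))   # ~35.1W at a low ~0.9V
print(round(package_power_w(8, 4.2, 1.35), 1))   # ~99.9W once you chase clocks at ~1.35V

The exact numbers don't matter; the point is the V^2 term, which is why low-clock, many-core parts are so frugal.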

Seamonster
Apr 30, 2007

IMMER SIEGREICH


Yeaaaaahh passive cool that shiznit

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Wedge Regret

I'll take more than 12 cores, of course. And since the R7 2700 is a 65W part, it seems almost inevitable that there will be at least a 12-core part in that same envelope, given the move to 7nm.

I'm just keeping my expectations on the low/realistic side.

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

Yeah, a ~3GHz 12C/24T 40-65W Zen2 is probably doable if Zen and Zen+ are anything to go by.

It's only if you want lots of cores (12C+) and high clocks (~4.5GHz+) that Zen2 will become furnace-like, and even then it will probably only be somewhat less difficult to cool than the ~5GHz i9-9900Ks.

Maaayyybbbeee having the cores and IO spread across 2-3 dies plus a better soldered IHS, versus Intel's monolithic die and so-so soldered IHS approach with the i9-9900K, will help with cooling? At least in terms of moderating the peak temp so that you're less heat-limited, even if you've still got lots of heat to get rid of. I'd expect to still be looking at cooling somewhere north of 150W with a 12C/24T Zen2 @ 4.5GHz-ish though, so not exactly easy even if it's not too hard.

The 8C/16T Zen2 parts should be much easier to deal with in comparison in terms of necessary cooling and power draw, even when overclocked a fair amount, which will be a nice upgrade path for most, I bet.
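
As a sanity check on the "north of 150W" guess, here's the same kind of crude scaling applied to a 12C part at ~4.5GHz, using the 65W 8C R7 2700 as the baseline. The voltage is assumed and any 7nm efficiency gains are deliberately ignored, so treat it as a pessimistic sketch, not a prediction:

code:
# Crude extrapolation, no real silicon data: scale a 65W 8C baseline by
# core count, frequency, and an assumed voltage bump. Ignores any 7nm
# efficiency gains, so it's a worst-ish case, not a prediction.

BASE_CORES, BASE_FREQ_GHZ, BASE_VCORE, BASE_WATTS = 8, 3.2, 1.00, 65.0

def scaled_power_w(cores, freq_ghz, vcore):
    scale = (cores / BASE_CORES) * (freq_ghz / BASE_FREQ_GHZ) * (vcore / BASE_VCORE) ** 2
    return BASE_WATTS * scale

# 12C/24T at ~4.5GHz with an assumed ~1.3V overclocking voltage
print(round(scaled_power_w(12, 4.5, 1.30)))  # ~232W, so "north of 150W" is easy to believe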

Bloody Antlers
Mar 27, 2010


Anyone expecting GloFo to license a 7nm manufacturing process from Samsung next year?

IIRC, GF was going to have to renegotiate with Samsung and AMD alike in 2020 anyway. Halting their process in 2018 seems like something that makes sense if they knew they were not going to be able to have a comparable 7nm node online in time for the negotiations, as it would be much better to approach Samsung as a partner than as a competitor in that scenario.

GloFo doesn't seem to be in a bad spot - 12nm & 14nm demand will be profitable to service for years to come, and with the true state of GloFo's own 7nm process development unknown, Samsung should prefer to license its polished IP to GloFo and reap the considerable profit that comes from that, because who else are they going to license it to at a higher price?

wargames
Mar 16, 2008

official yospos cat censor


Bloody Antlers posted:

Anyone expecting GloFo to license a 7nm manufacturing process from Samsung next year?

IIRC, GF was going to have to renegotiate with Samsung and AMD alike in 2020 anyway. Halting their process in 2018 seems like something that makes sense if they knew they were not going to be able to have a comparable 7nm node online in time for the negotiations, as it would be much better to approach Samsung as a partner than as a competitor in that scenario.

GloFo doesn't seem to be in a bad spot - 12nm & 14nm demand will be profitable to service for years to come, and with the true state of GloFo's own 7nm process development unknown, Samsung should prefer to license its polished IP to GloFo and reap the considerable profit that comes from that, because who else are they going to license it to at a higher price?

GloFo is part of the Samsung/IBM Common Platform alliance or whatever and should get their tech from that, right?

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.


Taco Defender

Bloody Antlers posted:

Anyone expecting GloFo to license a 7nm manufacturing process from Samsung next year?

IIRC, GF was going to have to renegotiate with Samsung and AMD alike in 2020 anyway. Halting their process in 2018 seems like something that makes sense if they knew they were not going to be able to have a comparable 7nm node online in time for the negotiations, as it would be much better to approach Samsung as a partner than as a competitor in that scenario.

GloFo doesn't seem to be in a bad spot - 12nm & 14nm demand will be profitable to service for years to come, and with the true state of GloFo's own 7nm process development unknown, Samsung should prefer to license its polished IP to GloFo and reap the considerable profit that comes from that, because who else are they going to license it to at a higher price?

I thought the main reason GF dropped 7nm was that the investment fund that owns them was sick of sinking money into new fabs right as the old ones were just getting paid off? If that was the case, then I don't think they'll bother to invest in 7nm unless it looks like demand for 14/12nm is starting to drop off, which I don't think will happen for quite a while.

Klyith
Aug 3, 2007

GBS Pledge Week


GloFo had the tech ready, but the only person in line for pre-orders was AMD. All their other customers were like, nah we'll stick with cheaper 12nm thanks.

AMD alone doesn't do enough volume to make the massive investment of a whole process line pay off -- that's why they spun off Global Floundaries in the first place. But that means AMD is at the mercy of industry demand, and the industry evidently doesn't have that much need for bleeding edge 7nm stuff.

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

It's probably more correct to say they have a use for 7nm, they just can't afford to pay the price.

Seamonster
Apr 30, 2007

IMMER SIEGREICH


Klyith posted:

the industry evidently doesn't have that much need for bleeding edge 7nm stuff.

x86 stuff sure but dat mobile SoC space tho

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!


Seamonster posted:

x86 stuff sure but dat mobile SoC space tho

Not even then. Apple didn't get enough out of moving from 10nm to 7nm to really justify the cost, besides the claim to bleeding-edge tech expected of them. I can see most SoC designs stopping at TSMC and GloFo 12nm, which I imagine both will continually refine until they're close enough to 10nm to not matter, and they'll be perfectly adequate for 90% of user needs.

I mean, honest to gods, what is the iPhone XS doing that the iPhone 8 or X can't do? How do these features represent a significant upgrade?

Sidesaddle Cavalry
Mar 15, 2013

65535

dispel please


The upgrade is you, you bought a new iPhone so you upgraded your fashion and that will surely make other people like you

VostokProgram
Feb 20, 2014



It should help with battery life if nothing else. Batteries aren't getting much better and the bigger, brighter, higher resolution screens suck more and more energy. I'm sure somebody somewhere wants to put HDR on a phone too

JazzmasterCurious
May 20, 2006
Few people know the living legend JazzmasterCurious.

Also smartwatches, where Apple has the only really good SoC.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!


EmpyreanFlux posted:

Not even then. Apple didn't get enough out of moving from 10nm to 7nm to really justify the cost, besides the claim to bleeding-edge tech expected of them. I can see most SoC designs stopping at TSMC and GloFo 12nm, which I imagine both will continually refine until they're close enough to 10nm to not matter, and they'll be perfectly adequate for 90% of user needs.

I mean, honest to gods, what is the iPhone XS doing that the iPhone 8 or X can't do? How do these features represent a significant upgrade?

The Kirin 980 got good performance and efficiency gains going to 7nm in Anandtech's testing, and this carries over into better battery life for devices with the Kirin 980 as well.

MaxxBot fucked around with this message at Feb 11, 2019 around 08:23

Arzachel
May 12, 2012


MaxxBot posted:

The Kirin 980 got good performance and efficiency gains going to 7nm in Anandtech's testing, and this carries over into better battery life for devices with the Kirin 980 as well.



Looking at the review, I feel the screen is pulling more of the weight, especially since the PCMark bench is much closer.

Klyith
Aug 3, 2007

GBS Pledge Week


MaxxBot posted:

The Kirin 980 got good performance and efficiency gains going to 7nm in Anandtech's testing, and this carries over into better battery life for devices with the Kirin 980 as well.

The very Anandtech article you're citing also mentions that the previous Huawei chip was on out-of-date 16FFC and had lousy power efficiency compared to competing chips. So yes, skipping a couple of generations makes a big difference, but it's worthless when talking 7nm vs current alternatives.



Anyways, when GlobalFoundries cancelled 7nm they were saying that the thing they were concentrating on was 12FDX, which is a very low-power process -- though not a high-performance one. So for consumer mobile that will be a cheap way to get the same battery-life advantage.

7nm is great and all, but people aren't paying for luxx phones every year these days. Just look at Apple's recent performance.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!


VostokProgram posted:

It should help with battery life if nothing else. Batteries aren't getting much better and the bigger, brighter, higher resolution screens suck more and more energy. I'm sure somebody somewhere wants to put HDR on a phone too

A quick look at GSMArena indicates the increase in battery life between models is...negative; the iPhone X has marginally better battery life than the XS. Compared to the XS Max, the total difference in talk time is minimal relative to the battery size, correct me if I'm wrong.

I mean, I don't doubt power/perf went up, it's a new node, but the question is, who is really taking advantage of that perf? Luxury phones are kind of butting heads with laptop performance these days, but phones are a lovely enough interface that you don't really want to be doing laptop things with them, and what you are doing with them doesn't demand that bleeding-edge performance.

SwissArmyDruid
Feb 14, 2014



Yeah, it's not the processor that's the major consumer of power in phones these days, it's the display.

VostokProgram
Feb 20, 2014



No that's exactly my point. The display sucks more and more power but the battery can only be so big, so it's the CPU and other electronics that need to get more efficient.

E: in other words, making your silicon more power efficient isn't something you do to make battery life go up, it's something you do to make it stay the same or go down less
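
To put toy numbers on that (all of them made up, just to show the shape of the problem): when the display dominates the budget, a sizeable SoC efficiency gain mostly gets eaten by a hungrier panel.

code:
# Toy energy budget with made-up numbers, only to illustrate the point
# above: when the display dominates, a 25% more efficient SoC barely
# moves screen-on time if the panel gets hungrier at the same time.

BATTERY_WH = 10.0   # roughly a 2,700mAh pack at 3.7V, an assumed flagship-ish size

def screen_on_hours(display_w, soc_w, other_w=0.3):
    return BATTERY_WH / (display_w + soc_w + other_w)

old_gen = screen_on_hours(display_w=1.2, soc_w=0.8)
new_gen = screen_on_hours(display_w=1.5, soc_w=0.6)  # brighter panel, more efficient SoC
print(round(old_gen, 1), round(new_gen, 1))          # ~4.3h vs ~4.2h: the gain just holds the line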

Bloody Antlers
Mar 27, 2010


Thanks for the clarification!

Unfortunately, it seems that they are having issues with the 12FDX page itself.


EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!


VostokProgram posted:

No that's exactly my point. The display sucks more and more power but the battery can only be so big, so it's the CPU and other electronics that need to get more efficient.

E: in other words, making your silicon more power efficient isn't something you do to make battery life go up, it's something you do to make it stay the same or go down less

I mean, I think we're talking past each other here, when what I'm saying is that 7nm is only necessary for luxury phones, and 16/14/12/10nm designs are cheaper and get you 80-90% of the performance at much reduced cost. If your argument is that it's necessary for luxury phones because they're constantly trying to push the performance envelope, then I don't think we disagree; I merely think that the market for that is going to be very small and won't justify a major shift to 7nm across the board.

EmpyreanFlux fucked around with this message at Feb 13, 2019 around 18:53

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

7nm for phone SoC's makes sense when you consider the malware and adware that has to run all the time on them in the background eating up CPU time and the batteries.

Sidesaddle Cavalry
Mar 15, 2013

65535

dispel please


PC LOAD LETTER posted:

7nm for phone SoC's makes sense when you consider the malware and adware that has to run all the time on them in the background eating up CPU time and the batteries.

I get where you're coming from, but blame the groups who design that poo poo for the most commercialized electronic platform to date. The desktop is technically older and people hadn't fully figured out how to exploit it for profit back around the turn of the century, and a lot of people who post on the internet about things like this have a great deal of nostalgia for those days

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


VostokProgram posted:

No that's exactly my point. The display sucks more and more power but the battery can only be so big

Tell that to the Ulefone Power 5s: http://ulefone.com/products/power5s/features.html

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

Sidesaddle Cavalry posted:

but blame the groups who design that poo poo for the most commercialized electronic platform to date

Oh yeah, sure, but I wasn't really trying to go there.

My post was more of a response to those who seemed to think 7nm phone SoCs were almost totally pointless and/or a niche use at best in the thread.

I personally don't like AT ALL that the adware/malware situation is so bad that it'd be a somewhat valid reason (since it's the path of least resistance to improving battery life and phone performance) to improve the phone hardware, but it's not something the average end user or hardware OEM can seem to do much about.

edit: \/\/\/\/\/\/\/\/ context was 7nm phone SoC's not 7nm Ryzen gaaah\/\/\/\/\/\/\/\/

PC LOAD LETTER fucked around with this message at Feb 13, 2019 around 10:17

Klyith
Aug 3, 2007

GBS Pledge Week


PC LOAD LETTER posted:

My post was more of a response to those who seemed to think 7nm phone SoCs were almost totally pointless and/or a niche use at best in the thread.

I started this derail by saying "need" when I should have said "demand". I repent my mistake.


Don't think anyone thinks 7nm is pointless, just that the expense outweighs the benefits for most consumer phones in 2019. We're all excited for 7nm Ryzen after all!

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



Nap Ghost


Those things are... suspiciously inexpensive. I'm guessing you'd never get an upgrade off of the already hilariously out of date OS they ship, but that battery life has to be someone's niche demand.

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

It's because it's using a MediaTek chipset, and those tend to be fairly cost-optimized.

The specs for the chips usually sound pretty good on paper, and they give a good showing in synthetic benches, but in the real world performance is usually mediocre compared to Qualcomm's stuff and software support is usually poor at best.

Actual battery capacity is probably BS too, though it'll probably still have a lot more capacity than most.

My family bought lots of those Chinese phones, and I gave some a shot too, and it was always the same with MediaTek despite the hype.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.

Toilet Rascal

Munkeymon posted:

I'm guessing you'd never get an upgrade off of the already hilariously out of date OS they ship, but that battery life has to be someone's niche demand.

Well, Android 9.0 is the newest but it ships with 8.1, and 9.0 had known battery life issues last time I checked. So going with 8.1 makes the most sense since it will give the biggest battery life possible, which seems to be the design goal.

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



Nap Ghost

pixaal posted:

Well, Android 9.0 is the newest but it ships with 8.1, and 9.0 had known battery life issues last time I checked. So going with 8.1 makes the most sense since it will give the biggest battery life possible, which seems to be the design goal.

There's a version of Android that nobody claims has battery life issues?!

PC LOAD LETTER posted:

Actual battery capacity is probably BS too, though it'll probably still have a lot more capacity than most.

I wonder about the quality of the battery, too, at that price. Though if that capacity isn't a lie, it should take quite a lot of abuse before it loses significant capacity.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.

Toilet Rascal

Munkeymon posted:

There's a version of Android that nobody claims has battery life issues?!

There's apparently a 10-20% drop in battery life when you upgrade a device from 8.x to 9.x; it's pretty significant.

Stanley Pain
Jun 16, 2001

Bit. Trip. RIP.


Munkeymon posted:

Those things are... suspiciously inexpensive. I'm guessing you'd never get an upgrade off of the already hilariously out of date OS they ship, but that battery life has to be someone's niche demand.

It might have something to do with the 13,000 mAh battery in there. That is a beastly battery.
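
For scale (cell voltage and average draw are assumptions here, not Ulefone's published specs), 13,000mAh works out to roughly 50Wh, which is laptop-battery territory:

code:
# Rough conversion only: nominal cell voltage and average draw are
# assumptions, not Ulefone specs.
capacity_wh = 13_000 / 1000 * 3.8        # ~49.4Wh at a nominal 3.8V
print(round(capacity_wh, 1))             # 49.4
print(round(capacity_wh / 2.0, 1))       # ~24.7h of screen-on time at an assumed 2W average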

fishmech
Jul 16, 2006

~death to capitalism~
Upgrade your OS
Homestuck Rules



Salad Prong

pixaal posted:

There's apparently a 10-20% drop in battery life when you upgrade a device from 8.x to 9.x; it's pretty significant.

This sure as poo poo didn't happen with my Pixel 2 XL.

SwissArmyDruid
Feb 14, 2014



Intel sure can't catch a break. Did Zen2-based EPYC add any similar features?

https://arstechnica.com/gadgets/201...virus-software/

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

SwissArmyDruid posted:

Intel sure can't catch a break. Did Zen2-based EPYC add any similar features?

Don't know about Zen2, but Zen and Zen+ Epyc support SME (no secure enclaves or Intel-style approved whitelist needed, but it will encrypt all or chunks of memory as needed) and SEV (each VM gets its own crypto key), which isn't the same thing as SGX and doesn't seem to be able to be misused to own a machine as thoroughly as SGX can be, though it is supposed to meet the same goal of more security as needed.

Whether or not SME/SEV are considered as effective overall as SGX security-wise, I don't know enough to judge. Reading that article a little more, it seems like they're considered a "better than nothing but not good or great" security improvement.

Lots more info here: https://lwn.net/Articles/686808/

edit: there was an attack you could pull on AMD's PSP a while back, but IIRC it required physical access to the machine to pull off; I think they did a fix for it in one of the AGESA updates late last year or early this year. \/\/\/\/\/\/\/\/\/
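
If you want to check whether a given Linux box or VM even advertises these features, a minimal sketch (assuming a reasonably recent kernel, which exposes the CPUID bits as "sme" and "sev" in /proc/cpuinfo; whether SEV is actually enabled also depends on firmware and kernel parameters):

code:
# Minimal sketch, assuming Linux: recent kernels list the SME/SEV CPUID
# bits as "sme" and "sev" in /proc/cpuinfo. This only shows whether the
# features are advertised, not whether they're enabled or configured.

def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for feature in ("sme", "sev"):
    print(feature, "advertised" if feature in flags else "not advertised")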

PC LOAD LETTER fucked around with this message at Feb 14, 2019 around 14:30

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.


Taco Defender

SME and SEV are both managed by an ARM Cortex-A5 CPU (the PSP) embedded in the main x86 CPU, which could in theory be attacked in a similar manner, though it would probably be much harder since AMD doesn't allow custom code to run on the ARM core, unlike Intel, who apparently allow people to supply their own code (so long as it's signed/approved by Intel).

NewFatMike
Jun 11, 2015



I know it's been tossed around, but did anyone ever make a general CPU architecture thread? I'd love to hear folks smarter than me talk about RISC-V, MIPS, and new ARM things like their new Neoverse platform:

https://www.anandtech.com/show/1395...se-n1-platform/

SwissArmyDruid
Feb 14, 2014



If you're using a Ryzen Mobile processor, That One GPU Driver Update We've Been Waiting For is finally out.
