Zarin
Nov 11, 2008

I SEE YOU

Cactus posted:

Yeah this is me. I'm convinced, from just looking around and seeing what's happening, that civilisation as we know it is coming to an end in the next decade or so

Eh, I'm sure someone will keep the light of civilization around. I probably just won't be allowed to live there.

Warmachine
Jan 30, 2012



Happy_Misanthrope posted:

New Nvidia marketing campaign

You don't know what tomorrow will bring, so bring home the peace of mind of a 3090ti Titan SUPER today for the low low price of your entire retirement fund.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Sid Meier is like 66 so I think we've got at least another 10 years of civilization.

Warmachine
Jan 30, 2012



mobby_6kl posted:

Sid Meier is like 66 so I think we've got at least another 10 years of civilization.

If he's like Tom Clancy, we'll be getting another 10-20 on top of that from his ghost.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
Looks like Nvidia is interested in buying ARM from Softbank.

https://www.reuters.com/article/us-...SKCN24N2P1?il=0

CaptainSarcastic
Jul 6, 2013



OhFunny posted:

Looks like Nvidia is interested in buying ARM from Softbank.

https://www.reuters.com/article/us-...SKCN24N2P1?il=0

That's... interesting. I'm not quite awake enough to fully process what that could mean.

Cygni
Nov 12, 2005

raring to post

OhFunny posted:

Looks like Nvidia is interested in buying ARM from Softbank.

https://www.reuters.com/article/us-...SKCN24N2P1?il=0

Daaaaaaaaaaaaamn. Would be wild if that happened.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

CaptainSarcastic posted:

That's... interesting. I'm not quite awake enough to fully process what that could mean.

Well, NVidia might finally be able to dick over Apple for refusing to use NVidia GPUs for the last decade, which would be hilarious.

More realistically, I'd see it as another move from them to try to get deeper into the data center by being able to provide full custom compute solutions based on their GPUs + ARM CPUs.

NewFatMike
Jun 11, 2015

I was just about to comment the same. That would be super wild from a SoC perspective. If you're licensing an ISA from a subsidiary, you probably get *real* good deals.

After AMD's next cycle of tripping on their own dicks, they'll get acquired by the RISC-V Foundation to maintain balance :v:

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
nVidia has a higher valuation than Intel right now. It's not out of the realm of possibility that they're trying to pull a Cyberdyne.

ufarn
May 30, 2009
Who needs their dumb cars, give me this, Nvidia

https://www.youtube.com/watch?v=xcgVztdMrX4

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

DrDork posted:

Well, NVidia might finally be able to dick over Apple for refusing to use NVidia GPUs for the last decade, which would be hilarious.

What sort of license agreement does Apple have anyway? If Softbank is trying to get rid of ARM Holdings, I wouldn't be surprised if it went to Apple by "surprise". Of course, that'd suck even more, considering how Apple handles its acquisitions.

Cygni
Nov 12, 2005

raring to post

They have full ISA access, and the rumor at the time was that it was basically a forever agreement with set payments that only they could terminate.

Apple more or less created ARM with Acorn in the first place, and contributed to the R&D and design, so the deal is likely reflective of that.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Nvidia + ARM, eh? So what you're saying is, in 5 years the standard unit of computing will be the Nintendo Switch.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

K8.0 posted:

Nvidia + ARM, eh? So what you're saying is, in 5 years the standard unit of computing will be the Nintendo Switch.

Imagine a Beowulf cluster of those!

shrike82
Jun 11, 2005

Nvidia has been selling neat ARM + Nvidia GPU embedded devices for a while, with a focus on AI inferencing.
Not much market traction, for whatever reason.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

K8.0 posted:

Nvidia + ARM, eh? So what you're saying is, in 5 years the standard unit of computing will be the Nintendo Switch.

some kind of NVIDIA Unit of Computing? :thunk:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Cygni posted:

Apple more or less created ARM with Acorn in the first place, and contributed to the R&D and design, so the deal is likely reflective of that.

You're probably right, but I can't help but be amused to see Apple just announce that they're going all-in on ARM, only to have it be potentially bought up by one of the few companies in the tech world that they absolutely refuse to do business with.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
I don't really follow Apple stuff but I like Nvidia products, although a friend worked there a couple of years ago and said he hated it. Anyway, why does Apple not do business with them?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

redreader posted:

I don't really follow Apple stuff but I like Nvidia products, although a friend worked there a couple of years ago and said he hated it. Anyway, why does Apple not do business with them?

Apple got butthurt when the first generation of RoHS-compliant solder turned out to be brittle and failed over time, and demanded NVIDIA pay for all of the recalls.

It was an industry-wide problem; that's when the phenomenon of "baking your GPU" to reflow the solder took off (including on AMD GPUs). But Apple wanted NVIDIA to pay for all the recalls, and NVIDIA refused.

Apple's a difficult customer too: they see themselves as the 800-pound gorilla, and if you don't cater to them they'll find someone else who will. Plus they're strongly wary of third-party ecosystems - they don't want their customers locked into any walled garden but Apple's, which is why OpenCL appealed to them over CUDA.

Paul MaudDib fucked around with this message at 02:29 on Jul 23, 2020

Zarin
Nov 11, 2008

I SEE YOU

Paul MaudDib posted:

Apple's a difficult customer too: they see themselves as the 800-pound gorilla, and if you don't cater to them they'll find someone else who will.

Yeah, that sounds familiar. I've worked for a company that really liked to swing its dick around; I was legit perplexed why some suppliers would even bother to do business with us.

shrike82
Jun 11, 2005

It's business. You hear stories about Nvidia trying to dick around with TSMC and getting its rear end reamed, and of course with Intel over the years.

Warmachine
Jan 30, 2012



Zarin posted:

Yeah, that sounds familiar. I've worked for a company that really liked to swing its dick around; I was legit perplexed why some suppliers would even bother to do business with us.

It's because Biggus Dickus pays twice as much as all of our smaller clients combined. The company I work for has a few of these. At least for us, these customers basically act as an R&D fund: they pay us to make something, we make it, then we turn around and sell it to everyone else.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Paul MaudDib posted:

Apple got butthurt when the first generation of RoHS-compliant solder turned out to be brittle and failed over time, and demanded NVIDIA pay for all of the recalls.

It was an industry-wide problem; that's when the phenomenon of "baking your GPU" to reflow the solder took off (including on AMD GPUs). But Apple wanted NVIDIA to pay for all the recalls, and NVIDIA refused.


I seem to remember more episodes of Nvidia and Apple having problems; it seems they have a long history.

https://www.pcgamesn.com/nvidia/nvidia-apple-driver-support

Also, didn't Nvidia screw Apple at one of their events? Or announce something ahead of time, which Apple hates?

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Zarin posted:

Yeah, that sounds familiar. I've worked for a company that really liked to swing its dick around; I was legit perplexed why some suppliers would even bother to do business with us.

I've now worked at two separate places where I explained that Walmart is Walmart because they win every single vendor deal - no one is happy working with them. Both times we ended up majorly regretting it, and had a really nasty break-up where they definitely had the upper hand and took advantage of it.

Freakazoid_
Jul 5, 2013


Buglord

Cactus posted:

When can we expect some actual news? I'm getting impatient and these rumours are just that - rumours.

:agreed:

I'm itchin' for a whole new PC and I need to know what I'd be missing out on by buying parts too soon.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Freakazoid_ posted:

:agreed:

I'm itchin' for a whole new PC and I need to know what I'd be missing out on by buying parts too soon.

The "thing" about the current AMD CPUs, even though they've got Intel beat on cores, is that at least in single-threaded performance, the 3950X is about the same as a 7700K. Which sounds bad, until you realize the 9900K is only ~15% faster (at best) than the 7700K and the 10900K is only ~5% faster than the 9900K.

CPUs have hit a wall 'speed-wise' until the next big breakthrough. If the next-gen Ryzen release was going to match or exceed Intel, AMD would have leaked as much by now. Expect 8700K-level single-threaded performance with more cores.
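
Back-of-the-envelope, compounding those rough percentages (the ~15% and ~5% figures above are ballpark estimates, not benchmark data):

code:

# Compounding the rough generational gains cited above
# (~15% and ~5% are ballpark estimates, not benchmarks).
gains = {"7700K -> 9900K": 1.15, "9900K -> 10900K": 1.05}

perf = 1.0  # 7700K normalized to 1.0
for step, gain in gains.items():
    perf *= gain
    print(f"{step}: {perf:.2f}x the 7700K")

# Net: 1.15 * 1.05 = ~1.21x, i.e. only ~21% more single-threaded
# performance across three generations of Intel parts.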

~That being said~, next-gen consoles will be all about using their 8C16T CPUs, so it'll probably be a good idea to use that as a benchmark. My guess is the top-end next-gen Ryzen consumer part will be a 22C44T part, because they want companies to keep buying Threadripper for workstations.

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Interesting. Thank you!

CaptainSarcastic
Jul 6, 2013



I got my Ryzen 5 3600X with the plan that once AMD definitively said their next CPUs would be AM5, I'd jump up to a faster-clocked, higher core/thread-count CPU from their last AM4 iteration. Given their history, I'm thinking that might end up being something like a 4800XT or 4850XT or 5800XT or whatever, if they do something like Zen 3+ or otherwise have a mid-cycle refresh of some sort.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Comfy Fleece Sweater posted:

I seem to remember more episodes of Nvidia and Apple having problems; it seems they have a long history.

https://www.pcgamesn.com/nvidia/nvidia-apple-driver-support

Also, didn't Nvidia screw Apple at one of their events? Or announce something ahead of time, which Apple hates?

the major historical (i.e. something that would have affected the swap in 2011 or whatever) claims I see are:

quote:

It claims this may be all down to the various poorly executed deals and blows between the two companies that go back over a decade. Apple was forced to delay a product due to Nvidia’s issues producing the GeForce 6800; Nvidia then was to blame for faulty MacBook Pros; Intel got angry about Nvidia’s tech; Apple subsequently went to AMD for help… yada yada, cut back to 2019, and Apple and Nvidia no longer get along.

(a) NVIDIA delayed a GPU this one time in 2007 because of production issues

(b) bumpgate (Apple whines about RoHS solder failing)

(c) "Intel got mad about NVIDIA's tech": not sure, but maybe this? i.e. Intel sued NVIDIA over making third-party Intel-compatible motherboard chipsets (completely legal at the time, and widespread 15-30 years ago), and Apple got mad about that?

Not seeing how that's not a bitchy big-dick client story.

inb4 "but Charlie Demerjian said..."

Paul MaudDib fucked around with this message at 06:04 on Jul 23, 2020

Cygni
Nov 12, 2005

raring to post

Apple is generally a huge-volume, zero-profit client for a parts supplier, too. AMD has needed those sorts of deals the last few years; Nvidia has not.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Cygni posted:

Apple is generally a huge-volume, zero-profit client for a parts supplier, too. AMD has needed those sorts of deals the last few years; Nvidia has not.

It's also high-prestige, relatively speaking. Your hardware gets developed against by a bunch of people who built the full stack themselves on BSD. Your hardware is not going to look better on other platforms than it does on macOS. That's a win for AMD too.

Even at zero profit, that makes it a winner for an underdog. A big client is paying for your R&D direction and for you to add this cool brand-specific feature. And nowadays you'd make that play with the intention of getting acquired.

Nothing I've ever seen indicates that Apple isn't a big-dick client: demanding, kinda low-margin, but high-prestige - and if you don't do their thing, they'll swap providers and your firm loses 25% of its gross revenue overnight.

Seeing Apple pick up the ability to design the whole stack, competently, in like the last 10 years - and overwhelmingly efficient/performant within the last 5 - has been really impressive. They're done with other people's components, the A13X will probably compete with mobile Skylake or Renoir, and they own almost everything under it too.

The missing parts for a win for them are x86 access (either directly or via an emulation layer like Microsoft did with the Surface ARMs), and potentially owning their own fab. Imagine Apple buys some poo poo fab and dumps $10B into it; fabs are expensive, but not all that expensive in terms of Apple's cash reserves (the largest on the planet). They have the cash for research too.

Paul MaudDib fucked around with this message at 06:26 on Jul 23, 2020

shrike82
Jun 11, 2005

Ultimately it's a loss on Nvidia's part, given the lack of support for Nvidia cards on Apple hardware under any OS.
Nvidia's done pretty poorly overall in the mobile market.

CaptainSarcastic
Jul 6, 2013



Paul MaudDib posted:

Seeing Apple pick up the ability to do the whole stack, competently, in like the last 10 years - and overwhelmingly efficient/performant within the last 5 - has been really impressive. They're done with other people's components, the A13X will probably compete with mobile Skylake or Renoir, and they own almost everything under it too.

Wait, what? Do you own Apple stock or something?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

CaptainSarcastic posted:

Wait, what? Do you own Apple stock or something?

Apple now designs their own ARM cores in-house, and they're fantastically better than everything else on the market.

https://www.anandtech.com/show/14892/the-apple-iphone-11-pro-and-max-review/2

quote:

By far the biggest change on the SoC level has been the new system level cache (SLC). Already last year Apple had made huge changes to this block as it had adopted a new microarchitecture and increased the size from 4MB to 8MB. This year, Apple is doubling down on the SLC and it’s very evidently using a new 16MB configuration across the four slices. A single SLC slice without the central arbitration block increases by 69% - and the actual SRAM macros seen on the die shot essentially double from a total of 3.20mm² to 6.36mm².

The amount of SRAM that Apple puts on the A13 is staggering, especially on the CPU side: We’re seeing 8MB on the big cores, 4MB on the small cores, and 16MB on the SLC which can serve all IP blocks on the chip.
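
Quick sanity check on those figures (just arithmetic on the numbers quoted above):

code:

# Arithmetic check of the AnandTech SLC figures quoted above.
old_sram_mm2, new_sram_mm2 = 3.20, 6.36
print(f"SRAM macro area: {new_sram_mm2 / old_sram_mm2:.2f}x")  # ~1.99x - "essentially double"

old_slc_mb, new_slc_mb = 8, 16  # A12 -> A13 system level cache
print(f"SLC capacity: {new_slc_mb // old_slc_mb}x")            # 2x (4MB -> 8MB -> 16MB over two years)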

quote:

The large cores for this generation are called “Lightning” and are direct successors to last year’s Vortex microarchitecture. In terms of the core design, at least in regards to the usual execution units, we don’t see too much divergence from last year’s core. The microarchitecture at its heart is still a 7-wide decode front-end, paired with a very wide execution back-end that features 6 ALUs and three FP/vector pipelines.

Apple hasn’t made any substantial changes to the execution back-end, as both Lightning and Vortex are largely similar to each other. The notable exception to this is the complex integer pipelines, where we do see improvements. Here the two multiplier units are able to shave off one cycle of latency, dropping from 4 cycles to 3. Integer division has also seen a large upgrade as the throughput has now been doubled and latency/minimum number of cycles has been reduced from 8 to 7 cycles.

Another change in the integer units has been a 50% increase in the number of ALU units which can set condition flags; now 3 of the ALUs can do this, which is up from 2 in A12's Vortex.

As for the floating point and vector/SIMD pipelines, we haven't noticed any changes there.

In terms of caches, Apple seems to have kept the cache structures as they were in the Vortex cores of the A12. This means we have 8-way associative 128KB L1 instruction and data caches. The data cache remains very fast with a 3-cycle load-to-use latency. The shared L2 cache between the cores continues to be 8MB in size, however Apple has reduced the latency from 16 to 14 cycles, something we’ll be looking at in more detail on the next page when looking at the memory subsystem changes.

A big change to the CPU cores which we don’t have very much information on is Apple’s integration of “machine learning accelerators” into the microarchitecture. At heart these seem to be matrix-multiply units with DSP-like instructions, and Apple puts their performance at up to 1 Tera Operations (TOPs) of throughput, claiming an up-to 6x increase over the regular vector pipelines. This AMX instruction set is seemingly a superset of the ARM ISA that is running on the CPU cores.

I could go on, but. The A13 is incredibly impressive. They have around 70% higher IPC than Intel, iirc - it's wild, and I say that as a tech enthusiast. They just max out at 2.5 GHz or whatever, but they're almost performance-competitive at that speed because they dump shitloads of silicon on performance and efficiency - effectively the classic "low clocks, lots of cache, wide execution units" formula. They put as much cache as a 9900K on their "2 big / 4 LITTLE" design; a single A13 core can access as much cache as a 9900K. It's loving nuts in mobile terms. They don't care, because the cost of designing it in-house amortizes out for them.
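
To make the low-clock/high-IPC trade-off concrete, a rough model (single-thread performance ~ IPC x clock; the ~70% IPC and ~2.5 GHz numbers are this post's estimates, not measurements):

code:

# Rough single-thread model: performance ~ IPC * clock.
# All numbers are the post's own estimates, not benchmarks.
intel_ipc, intel_ghz = 1.00, 5.0  # Intel normalized to IPC = 1.0 (e.g. 9900K boost clock)
a13_ipc, a13_ghz = 1.70, 2.5      # "~70% higher IPC", "max out at 2.5 GHz"

ratio = (a13_ipc * a13_ghz) / (intel_ipc * intel_ghz)
print(f"A13 vs Intel single-thread: {ratio:.2f}x")  # ~0.85x

# Even at half the clock, the wide, cache-heavy core lands within ~15%
# of a 5 GHz desktop part - hence "almost performance-competitive".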

The efficiency cores are great although less impressive than the performance cores. The graphics cores seem fine and they guarantee low-level hardware support because you're on their hardware.

They own their own software stack: the OS is BSD-derived, but with a completely Apple-optimized kernel and userland. If they introduce some new capability, they just use it - no hardware problems. The browser does native assembly calls on some hot path? Whatever, they know the guy who wrote it. That's one of the reasons the iOS userland is consistently faster and lower-power than the Android userland.

Seriously, throw an efficient translation layer and possibly a natively compiled kernel in there and it'll run great - it can easily tie mobile Skylake, I bet. If they do the "interpreted userland / native kernel" approach, I bet it'll be usable on Windows, let alone OSX.

From a long-term strategic perspective, they're only missing x86 compatibility (nice to have, but they have their own ecosystem) and maybe fabbing their own silicon (if they feel like challenging TSMC, which they could easily do in another five or ten years - they have the cash; they could buy TSMC outright if it were for sale).

The Solaris strategy actually works well. Sad but true.

Paul MaudDib fucked around with this message at 06:54 on Jul 23, 2020

..btt
Mar 26, 2008

Cygni posted:

Apple more or less created ARM with Acorn in the first place, and contributed to the R&D and design, so the deal is likely reflective of that.

I'm sure you mean the spun-off company, not the architecture, but it always amuses me how Apple manages to get credit for inventing everything about 10 years after it starts existing.

SwissArmyDruid
Feb 14, 2014

by sebmojo

BIG HEADLINE posted:

nVidia has a higher valuation than Intel right now. It's not out of the realm of possibility that they're trying to pull a Cyberdyne.

I mean, AMD can also claim a higher per-share stock price than Intel right now. poo poo's wild.

DrDork posted:

Well, NVidia might finally be able to dick over Apple for refusing to use NVidia GPUs for the last decade, which would be hilarious.

More realistically, I'd see it as another move from them to try to get deeper into the data center by being able to provide full custom compute solutions based on their GPUs + ARM CPUs.

It probably also bears mentioning that Nvidia bought Mellanox. They're singularly positioned to be a one-stop shop for server compute.

shrike82
Jun 11, 2005

I don't know how much of a moat Nvidia has in compute, especially ML. Google came out with its own cloud ML hardware and within three years has hardware faster than Nvidia's enterprise-facing solutions. You can see a similar thing happening with Apple making claims about speed-ups in ML inferencing on their own silicon. ML comes down to lots of matmuls, so it's easier to design and build hardware around.
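
"ML comes down to lots of matmuls" in concrete terms: a dense neural-net layer is one matrix multiply plus a bias add, which is exactly what dedicated ML hardware accelerates. A minimal NumPy sketch for illustration:

code:

import numpy as np

# A dense (fully connected) layer forward pass is one matmul plus bias:
#     y = activation(x @ W + b)
# which is why matmul-centric hardware (TPUs, tensor cores, Apple's AMX)
# maps so directly onto neural-net workloads.
batch, d_in, d_out = 32, 512, 256
x = np.random.randn(batch, d_in).astype(np.float32)  # input activations
W = np.random.randn(d_in, d_out).astype(np.float32)  # layer weights
b = np.zeros(d_out, dtype=np.float32)                # bias

y = np.maximum(x @ W + b, 0.0)  # ReLU(x @ W + b)
print(y.shape)  # (32, 256)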

More broadly, Nvidia's currently selling the hardware, but the various providers (Amazon, Google, Microsoft) are the ones capturing the metered revenue. It's going to be easier for cloud providers to roll out their own hardware than it is for Nvidia to break into the cloud market.

Cygni
Nov 12, 2005

raring to post

..btt posted:

I'm sure you mean the spun-off company, not the architecture, but it always amuses me how Apple manages to get credit for inventing everything about 10 years after it starts existing.

Yeah, I’m talkin the modern ARM ISA lineage from arm6 on, not like the Berkeley RISC and early Acorn stuff. Admittedly I don’t know much about the really early stuff cause I never really had a reason to look into it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Not to reignite the "Nvidia is a software company that sells middleware for their hardware" debate, but:

I think a lot of people misunderstand Apple: they're a software company. The hardware is incidental - they just do it because they can do it better, and it's a good long-term investment. This is a tens-to-hundreds-of-billions bet.

So is NVIDIA. They don't care if the DGX-3whatever sells 10,000 units vs 100k units. poo poo, they'll pay to write your neural net software if you'll pay them unit cost for 100k units. They're selling software to enterprise customers, and here's the hardware that will do that best.

And what they sell to consumers is... drivers that don't crash every 30 minutes on your new GPU. And DLSS. Software.

There's a reason Intel used to sell in-house servers, build their own compiler, maintain their own MKL math libraries, etc.

Paying for the first million-dollar hamburger is a very profitable overall strategy. The building costs $1M, so the first burger costs $1,000,001; every burger after that costs the customer $1, and you make the profits back in the long term. "Nobody is going to pay to write good software for your chips except you."
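
The hamburger math spelled out (a toy fixed-cost amortization model; the $3 selling price is a made-up number for illustration):

code:

# Toy fixed-cost amortization using the post's hamburger numbers.
fixed_cost = 1_000_000   # the "building": up-front R&D / software investment
marginal_cost = 1        # cost of each additional burger (unit)
price = 3                # hypothetical selling price per unit

for units in (1, 100_000, 1_000_000):
    total_cost = fixed_cost + marginal_cost * units
    avg_cost = total_cost / units
    profit = price * units - total_cost
    print(f"{units:>9,} units: avg cost ${avg_cost:,.2f}, profit ${profit:,}")

# 1 unit:          avg cost $1,000,001 - the million-dollar hamburger
# 1,000,000 units: avg cost $2.00      - the fixed cost has amortized away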

Paul MaudDib fucked around with this message at 07:33 on Jul 23, 2020
