|
Cactus posted:Yeah this is me. I'm convinced, from just looking around and seeing what's happening, that civilisation as we know it is coming to an end in the next decade or so Eh, I'm sure someone will keep the light of civilization around. I probably just won't be allowed to live there
|
# ? Jul 22, 2020 15:58 |
|
|
Happy_Misanthrope posted:New Nvidia marketing campaign You don't know what tomorrow will bring, so bring home the peace of mind of a 3090ti Titan SUPER today for the low low price of your entire retirement fund.
|
# ? Jul 22, 2020 16:14 |
|
Sid Meier is like 66 so I think we've got at least another 10 years of civilization.
|
# ? Jul 22, 2020 16:15 |
|
mobby_6kl posted:Sid Meier is like 66 so I think we've got at least another 10 years of civilization. If he's like Tom Clancy, we'll be getting another 10-20 on top of that from his ghost.
|
# ? Jul 22, 2020 16:16 |
|
Looks like Nvidia is interested in buying ARM from Softbank. https://www.reuters.com/article/us-...SKCN24N2P1?il=0
|
# ? Jul 22, 2020 20:40 |
|
OhFunny posted:Looks like Nvidia is interested in buying ARM from Softbank. That's... interesting. I'm not quite awake enough to fully process what that could mean.
|
# ? Jul 22, 2020 20:58 |
|
OhFunny posted:Looks like Nvidia is interested in buying ARM from Softbank. Daaaaaaaaaaaaamn. would be wild if that happened.
|
# ? Jul 22, 2020 21:12 |
|
CaptainSarcastic posted:That's... interesting. I'm not quite awake enough to fully process what that could mean. Well, NVidia might finally be able to dick over Apple for refusing to use NVidia GPUs for the last decade, which would be hilarious. More realistically, I'd see it as another move from them to try to get deeper into the data center by being able to provide full custom compute solutions based on their GPUs + ARM CPUs.
|
# ? Jul 22, 2020 21:32 |
|
I was just about to comment the same. That would be super wild from a SoC perspective. If you're licensing an ISA from a subsidiary, you probably get *real* good deals. After AMD's next cycle of tripping on their own dicks, they'll get acquired by the RISC-V Foundation to maintain balance
|
# ? Jul 22, 2020 21:50 |
|
nVidia has a higher valuation than Intel right now. It's not out of the realm of possibility that they're trying to pull a Cyberdyne.
|
# ? Jul 22, 2020 22:01 |
|
Who needs their dumb cars, give me this, Nvidia https://www.youtube.com/watch?v=xcgVztdMrX4
|
# ? Jul 22, 2020 22:14 |
|
DrDork posted:Well, NVidia might finally be able to dick over Apple for refusing to use NVidia GPUs for the last decade, which would be hilarious.
|
# ? Jul 22, 2020 22:39 |
|
They have full ISA access, and the rumor at the time was that it was basically a forever agreement with set payments that only they could terminate. Apple more or less created ARM with Acorn in the first place, and contributed to the R&D and design, so the deal is likely reflective of that.
|
# ? Jul 22, 2020 22:46 |
|
Nvidia + ARM, eh? So what you're saying is, in 5 years the standard unit of computing will be the Nintendo Switch.
|
# ? Jul 22, 2020 23:37 |
|
K8.0 posted:Nvidia + ARM, eh? So what you're saying is, in 5 years the standard unit of computing will be the Nintendo Switch. Imagine a Beowulf cluster of those!
|
# ? Jul 22, 2020 23:54 |
|
Nvidia has been selling neat ARM-Nvidia GPU embedded devices for a while with a focus on AI inferencing. Not much market traction for whatever reason.
|
# ? Jul 23, 2020 00:30 |
|
K8.0 posted:Nvidia + ARM, eh? So what you're saying is, in 5 years the standard unit of computing will be the Nintendo Switch. some kind of NVIDIA Unit of Computing?
|
# ? Jul 23, 2020 00:35 |
|
Cygni posted:Apple more or less created ARM with Acorn in the first place, and contributed to the R&D and design, so the deal is likely reflective of that. You're probably right, but I can't help but be amused to see Apple just announce that they're going all-in on ARM, only to have it be potentially bought up by one of the few companies in the tech world that they absolutely refuse to do business with.
|
# ? Jul 23, 2020 00:54 |
|
I don't really follow Apple stuff but I like Nvidia products, although a friend worked there a couple of years ago and said he hated it. Anyway, why does Apple not do business with them?
|
# ? Jul 23, 2020 02:13 |
|
redreader posted:I don't really follow Apple stuff but I like Nvidia products, although a friend worked there a couple of years ago and said he hated it. Anyway, why does Apple not do business with them? apple got butthurt when the first generation of RoHS-compliant solder turned out to be brittle and failed over time, and demanded NVIDIA pay for all of the recalls. it was an industry-wide problem (that's when the phenomenon of "baking your GPU" to reflow the solder took off, including on AMD GPUs), but Apple wanted NVIDIA to pay for all the recalls and NVIDIA refused. Apple's a difficult customer too: they see themselves as the 800 pound gorilla, and if you don't cater to them then they'll find someone else who will. Plus they are strongly wary of third-party ecosystems; they don't want their customers to be locked into any walled gardens but apple's, so OpenCL appealed to them over CUDA. Paul MaudDib fucked around with this message at 02:29 on Jul 23, 2020 |
# ? Jul 23, 2020 02:18 |
|
Paul MaudDib posted:Apple's a difficult customer too, they see themselves as the 800 pound gorilla and if you don't cater to them then they'll find someone else who will. Yeah, that sounds familiar. I've worked for a company that really liked to swing its dick around; I was legit perplexed why some suppliers would even bother to do business with us.
|
# ? Jul 23, 2020 02:37 |
|
It's business. You hear stories about Nvidia trying to dick around with TSMC and getting its rear end reamed, and of course Intel over the years.
|
# ? Jul 23, 2020 03:01 |
|
Zarin posted:Yeah, that sounds familiar. I've worked for a company that really liked to swing its dick around; I was legit perplexed why some suppliers would even bother to do business with us. It's because Biggus Dickus pays twice as much as all of our smaller clients combined. The company I work for has a few of these. At least for us these customers act as basically an R&D fund: they pay us to make something, we make it, then turn around and sell it to everyone else.
|
# ? Jul 23, 2020 03:09 |
|
Paul MaudDib posted:apple got butthurt when the first generation of RoHS-compliant solder turned out to be brittle and failed over time and demanded NVIDIA pay for all of the recalls I seem to remember more episodes of Nvidia and Apple having problems; seems they have a long history: https://www.pcgamesn.com/nvidia/nvidia-apple-driver-support Also, didn't Nvidia screw Apple at one of their events? Or announce something ahead of time, which Apple hates?
|
# ? Jul 23, 2020 03:12 |
|
Zarin posted:Yeah, that sounds familiar. I've worked for a company that really liked to swing its dick around; I was legit perplexed why some suppliers would even bother to do business with us. I've worked now at two separate places where I explained that Walmart is Walmart because they win every single vendor deal; no one is happy working with them. Both times we ended up majorly regretting working with them, and had a really nasty break-up where they definitely had the upper hand and took advantage.
|
# ? Jul 23, 2020 03:27 |
|
Cactus posted:When can we expect some actual news? I'm getting impatient, and these rumours are just that: rumours. I'm itchin' for a whole new PC and I need to know what I'd be missing out on by buying parts too soon.
|
# ? Jul 23, 2020 05:12 |
|
Freakazoid_ posted:
The "thing" about the current AMD CPUs, even though they've got Intel beat on cores, is that at least in single-threaded performance, the 3950X is about the same as a 7700K. Which sounds bad, until you realize the 9900K is only ~15% faster (at best) than the 7700K and the 10900K is only ~5% faster than the 9900K. CPUs have hit a wall 'speed-wise' until the next big breakthrough. If the next-gen Ryzen release was going to match or exceed Intel, AMD would have leaked as much by now. Expect 8700K-level single-threaded performance with more cores. ~That being said~, next gen consoles will be all about using their 8C16T CPUs, so it'll probably be a good idea to use that as a benchmark. My guess is the top-end next-gen Ryzen consumer part will be a 22C44T part, because they want companies to keep buying Threadripper for workstations.
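Quick Python sketch of the compounding math above (the per-generation percentages are this post's rough estimates, not benchmarks):

```python
def cumulative_uplift(gains):
    """Compound per-generation fractional gains into one overall multiplier."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total

# 7700K -> 9900K (~+15%) -> 10900K (~+5%), per the estimates above
gens = [0.15, 0.05]
print(f"10900K vs 7700K single-thread: ~{(cumulative_uplift(gens) - 1) * 100:.0f}% faster")
```

Point being, even two Intel generations stacked together only buy you ~20% single-thread, which is what "hit a wall" means here.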
|
# ? Jul 23, 2020 05:25 |
|
K8.0 posted:stuff
|
# ? Jul 23, 2020 05:33 |
|
I got my Ryzen 5 3600X with the plan that once AMD definitively said their next CPUs would be AM5, I would jump up to a faster-clocked, higher core/thread CPU of their last AM4 iteration. With their history I'm thinking that might end up being something like a 4800XT or 4850XT or 5800XT or whatever, if they do something like Zen 3+ or otherwise have a mid-cycle refresh of some sort.
|
# ? Jul 23, 2020 05:42 |
|
Comfy Fleece Sweater posted:I seem to remember more episodes of Nvidia and Apple having problems, seems they have a long history the major historical (i.e. something that would have affected the swap in 2011 or whatever) claims I see are: quote:It claims this may be all down to the various poorly executed deals and blows between the two companies that go back over a decade. Apple was forced to delay a product due to Nvidia’s issues producing the GeForce 6800; Nvidia then was to blame for faulty MacBook Pros; Intel got angry about Nvidia’s tech; Apple subsequently went to AMD for help… yada yada, cut back to 2019, and Apple and Nvidia no longer get along. (a) nvidia delayed a gpu this one time in 2007 because of production issues (b) bumpgate (apple whines about RoHS solder failing) (c) "intel got mad about NVIDIA's tech": not sure but maybe this? i.e. Intel was suing NVIDIA for making third-party intel compatible motherboard chipsets (completely legal at the time and widespread 15-30 years ago), and apple got mad about that? not seeing how that's not a bitchy big-dick client story. inb4 "but charlie demerjian said..." Paul MaudDib fucked around with this message at 06:04 on Jul 23, 2020 |
# ? Jul 23, 2020 05:50 |
|
Apple is generally a huge volume, zero profit client for a parts supplier too. AMD has needed those sorts of deals the last few years, nvidia has not.
|
# ? Jul 23, 2020 06:05 |
|
Cygni posted:Apple is generally a huge volume, zero profit client for a parts supplier too. AMD has needed those sorts of deals the last few years, nvidia has not. it's also high-prestige, relatively. Your hardware gets developed against by a bunch of people who built the full stack themselves on BSD. Your hardware is not going to look better on other platforms than it does on MacOS. That's a win for AMD too; even at zero profit, that makes it a winner for an underdog. A big client is paying for your R&D direction and for you to add this cool brand-specific feature. And nowadays you'd make that play with the intention of getting acquired. Nothing I've ever seen indicates that Apple isn't a big-dick client: demanding, kinda low-margin but high-prestige, and if you don't do their thing then they'll swap providers and your firm loses 25% of its gross revenue overnight. Seeing Apple pick up the ability to design the whole stack, competently, in like the last 10 years, and overwhelmingly efficient/performant within the last 5 years, has been really impressive. They're done with other people's components, A13X will probably compete with mobile skylake or renoir, and they own almost everything under it too. The missing parts for a win for them are x86 access (either directly or via an emulation layer like microsoft did with the surface ARMs), and potentially owning their own fab. Imagine Apple buys some poo poo fab and dumps $10b into it; fabs are expensive, but not all that expensive in terms of Apple's cash reserves (largest on the planet). They have the cash for research too. Paul MaudDib fucked around with this message at 06:26 on Jul 23, 2020 |
# ? Jul 23, 2020 06:17 |
|
Ultimately a loss on Nvidia's part given the lack of support for Nvidia cards on any platform on Apple hardware. Nvidia's done pretty poorly overall in the mobile market.
|
# ? Jul 23, 2020 06:17 |
|
Paul MaudDib posted:Seeing Apple pick up the ability to do the whole stack, competently, in like the last 10 years and overwhelmingly efficient/performant within the last 5 years , has been really impressive. They're done with other people's components, A13X will probably compete with mobile skylake or renoir, and they own almost everything under it too. Wait, what? Do you own Apple stock or something?
|
# ? Jul 23, 2020 06:26 |
|
CaptainSarcastic posted:Wait, what? Do you own Apple stock or something? Apple now designs their own ARM cores in house, and they're fantastically better than everything else on the market. https://www.anandtech.com/show/14892/the-apple-iphone-11-pro-and-max-review/2 quote:By far the biggest change on the SoC level has been the new system level cache (SLC). Already last year Apple had made huge changes to this block as it had adopted a new microarchitecture and increased the size from 4MB to 8MB. This year, Apple is doubling down on the SLC and it’s very evidently using a new 16MB configuration across the four slices. A single SLC slice without the central arbitration block increases by 69% - and the actual SRAM macros seen on the die shot essentially double from a total of 3.20mm² to 6.36mm². quote:The large cores for this generation are called “Lightning” and are direct successors to last year’s Vortex microarchitecture. In terms of the core design, at least in regards to the usual execution units, we don’t see too much divergence from last year’s core. The microarchitecture at its heart is still a 7-wide decode front-end, paired with a very wide execution back-end that features 6 ALUs and three FP/vector pipelines. I could go on, but the A13 is incredibly impressive. They have around 70% higher IPC than Intel iirc; it's wild, and I say that as a tech enthusiast. They just max out at 2.5 GHz or whatever, but they're almost performance-competitive at that speed because they dump shitloads of silicon on performance and efficiency: effectively the classic "low clocks, lots of cache, wide execution units" formula. They put as much cache on their "2 big / 4 LITTLE" design as a 9900K has, so a single A13 core can access as much cache as a 9900K, which is loving nuts in mobile terms. They don't care, because the cost of designing it in-house amortizes out for them. The efficiency cores are great too, although less impressive than the performance cores.
The graphics cores seem fine, and they can guarantee low-level hardware support because you're on their hardware. They own their own software stack; the OS is BSD, but with a completely Apple-optimized kernel and userland. If they introduce some new capability, they just use it; no compatibility problems. Browser does native assembly calls on some hot path? Whatever, they know the guy who wrote it. That's one of the reasons the iOS userland is consistently faster and lower-power than the android userland. Seriously, throw an efficient translation layer and possibly a native-compiled kernel in there and it'll run great; I bet that can easily tie mobile skylake. If they do the "interpreted userland / native kernel" thing, I bet it'll be usable running Windows, let alone OSX. From a long-term strategic perspective they are only missing x86 compatibility (nice to have, but they have their own ecosystem) and maybe building it in-house (if they feel like challenging TSMC, which they easily could do in another 5 or 10 years; they have the cash, and could buy TSMC outright if it were for sale). The Solaris strategy actually works well. Sad but true. Paul MaudDib fucked around with this message at 06:54 on Jul 23, 2020 |
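Back-of-envelope version of that argument in Python, treating single-thread perf as IPC × clock (the ~70% IPC edge and both clock speeds are this post's ballpark figures, not measurements):

```python
def relative_perf(ipc_ratio, clock_ghz, ref_clock_ghz):
    """Single-thread perf vs a reference chip, assuming perf = IPC * clock."""
    return ipc_ratio * (clock_ghz / ref_clock_ghz)

# A13 at ~2.5 GHz with ~1.7x the IPC of a desktop part boosting to ~5 GHz:
print(f"A13 relative single-thread perf: ~{relative_perf(1.7, 2.5, 5.0):.0%}")
```

i.e. roughly 85% of the desktop chip at half the clock, which is the "almost performance-competitive" claim above.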
# ? Jul 23, 2020 06:35 |
|
Cygni posted:Apple more or less created ARM with Acorn in the first place, and contributed to the R&D and design, so the deal is likely reflective of that. I'm sure you mean the spun-off company not the architecture, but it always amuses me how apple manages to get credit for inventing everything about 10 years after it starts existing.
|
# ? Jul 23, 2020 06:36 |
|
BIG HEADLINE posted:nVidia has a higher valuation than Intel right now. It's not out of the realm of possibility that they're trying to pull a Cyberdyne. I mean, AMD can also claim to have a higher per-share stock price than Intel right now, too. poo poo's wild. DrDork posted:Well, NVidia might finally be able to dick over Apple for refusing to use NVidia GPUs for the last decade, which would be hilarious. It also probably bears reminding that Nvidia also bought Mellanox. They are singularly positioned to be a one-stop-shop for server compute.
|
# ? Jul 23, 2020 06:37 |
|
I don't know how much of a moat Nvidia has with compute especially ML. Google came out with its own cloud ML hardware and in three years has hardware faster than Nvidia's enterprise-facing solutions. You can see a similar thing happening with Apple making claims about speed-ups in ML inferencing on their own silicon. ML comes down to lots of matmuls so it's easier to design and build around. More broadly, Nvidia's currently selling the hardware but the various providers (Amazon, Google, Microsoft) are the ones capturing metered revenue. It's going to be easier for cloud providers to rollout their own hardware than it is for Nvidia to break into the cloud market.
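Toy Python/NumPy illustration of the matmul point (the shapes are made up; this is just what a single dense layer looks like):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 128))  # batch of 32 inputs, 128 features each
W = rng.standard_normal((128, 64))  # layer weights
b = np.zeros(64)                    # layer bias

# An entire dense neural-net layer is one matrix multiply plus a bias add,
# which is why ML accelerators are basically matmul machines.
y = x @ W + b
print(y.shape)  # (32, 64)
```

Stack a nonlinearity on top and repeat, and that's most of the workload a TPU or tensor core is built to chew through.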
|
# ? Jul 23, 2020 06:50 |
|
..btt posted:I'm sure you mean the spun-off company not the architecture, but it always amuses me how apple manages to get credit for inventing everything about 10 years after it starts existing. Yeah, I’m talkin the modern ARM ISA lineage from arm6 on, not like the Berkeley RISC and early Acorn stuff. Admittedly I don’t know much about the really early stuff cause I never really had a reason to look into it.
|
# ? Jul 23, 2020 06:52 |
|
|
Not to reignite the "nvidia is a software company that sells middleware for their hardware" debate, but: I think a lot of people misunderstand Apple. They're a software company; the hardware is incidental. They just do it because they can do it better, and because it's a good long-term investment. This is a 10s-to-100s-of-billions bet. So is NVIDIA. They don't care if the DGX-3 or whatever sells 10,000 units vs 100k units. poo poo, they'll pay to write your neural net software if you'll pay them unit cost for 100k units. They're selling software to enterprise customers, and here's the hardware that will do it best. And what they sell to consumers is... drivers that don't crash every 30 minutes on your new GPU. And DLSS. Software. There is a reason Intel used to sell in-house servers, build their own compiler, maintain their own MKL math libraries, etc etc. Paying for the first million dollar hamburger is a very profitable overall strategy: the first burger costs $1m (the building) plus $1, every burger after that is $1 for the customer, and you make the profits in the long term. "Nobody is going to pay to write good software for your chips except you". Paul MaudDib fucked around with this message at 07:33 on Jul 23, 2020 |
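The hamburger math, sketched in Python (all figures invented for the analogy):

```python
def avg_cost(fixed_cost, unit_cost, units):
    """Average cost per unit once a big fixed cost is amortized over sales."""
    return (fixed_cost + unit_cost * units) / units

# $1m building, $1 per burger: the fixed cost vanishes at volume.
for n in (1, 1_000, 1_000_000):
    print(f"{n:>9,} burgers -> ${avg_cost(1_000_000, 1.0, n):,.2f} each")
```

Same shape as software R&D: the first DGX/driver/library costs a fortune, every copy after that is nearly free.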
# ? Jul 23, 2020 07:08 |