K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
If you're happy with the Switch experience you could play at low settings and cap your framerates really aggressively to keep the load down. Depending on how the thermal management is set up that may keep the fans under control. But yeah in terms of pure convenience nothing is remotely close to a Switch for portable gaming. Portable versions of high-performance computing parts are never going to compete on efficiency with actual mobile parts.
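An aggressive frame cap is basically just a sleep-to-deadline loop. A minimal sketch in Python (the 40 fps target is an assumption; pick whatever keeps the fans quiet):

```python
import time

TARGET_FPS = 40                 # assumed aggressive cap, well under the panel's refresh
FRAME_TIME = 1.0 / TARGET_FPS   # per-frame time budget in seconds

def render_frame():
    """Placeholder for the actual game/render work."""
    pass

def capped_loop(n_frames):
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        # Sleep off the rest of the frame budget so the CPU/GPU sit idle
        # instead of rendering frames nobody asked for.
        remaining = FRAME_TIME - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)
```

Tools like RTSS or in-game limiters do the same thing with much better timer precision; the point is just that every millisecond spent sleeping is a millisecond the silicon isn't dumping heat.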

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

DrDork posted:

Not sure which iteration of Razer's laptops you were using, but the current Stealth 15 starts with a 2070 and 144Hz or 4k OLED, which...uh....is considerably different from what you can get out of a MBP, regardless of price. Their keyboards are take it or leave it, though, and their trackpads try to be MBP ones but don't quite measure up. I dig the black aluminum frame, though.

If you want to say you like the experience of a MBP better because you like MacOS better than Windows, that's totally legit. Razer's biggest issue vs a MBP is that they tend to have shorter lives due to eventual failures, which is lovely considering the prices, but while they work they're pretty great for a gaming laptop, which is 100% what they are.

a lot of computers (and phones) that win on bullet-point specs with poo poo like “real 4K resolution” on a notebook and/or OLED (or frankly touchscreens for the first couple years of Win8/early Win10) end up being way jankier in practice.

I see this firsthand on Android phones with high/variable-refresh OLED displays: flagship phone screens straight up changing brightness and color behavior when switching refresh rate, totally inconsistent off-axis performance, weird color shifting at different brightness levels. Meanwhile everybody gave Apple poo poo for a hot second for shipping “just another LCD screen” on the iPhone XR, never mind that it went on to be rated best-in-class for its price bracket in terms of day-to-day QoL. And this isn’t an isolated thing, LCDs on Apple devices routinely outperform OLEDs on similarly priced products from competitors.

I’ve used a few Razer notebooks in the last decade and they always feel like 10-20% off from the bar set by a comparably priced MBP. Like, they’re clearly aiming for the same target but they can’t get the fit+finish dialed in that last bit.

trilobite terror fucked around with this message at 05:05 on Aug 4, 2020

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
A decent number of 5775C seem to have hit the surplus market in China and are now on eBay at the $90-120 range. Arguably not a bad deal if you’re looking to upgrade a 4690K system or whatever, if you can catch it bouncing down to $90 again and flip your 4690K for like 50 bucks.

I ordered one when they were first listed and it showed up today, haven’t had a chance to install it but it externally appears to be a normal, non-ES chip just as advertised.

I’m really curious where these would ever have been used such that they’re now finding their way onto the surplus market. Someone speculated to me maybe an Apple product, perhaps the iMac Pro? But even then you would think they would have used the 5775R and not the socketed version. I really didn’t know that anyone ever used socketed 5775Cs in any significant quantity in fact.

Paul MaudDib fucked around with this message at 05:56 on Aug 4, 2020

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Paul MaudDib posted:

A decent number of 5775C seem to have hit the surplus market in China and are now on eBay at the $90-120 range. Arguably not a bad deal if you’re looking to upgrade a 4690K system or whatever, if you can catch it bouncing down to $90 again and flip your 4690K for like 50 bucks.

I ordered one when they were first listed and it showed up today, haven’t had a chance to install it but it externally appears to be a normal, non-ES chip just as advertised.

I’m really curious where these would ever have been used such that they’re now finding their way onto the surplus market. Someone speculated to me maybe an Apple product, perhaps the iMac Pro? But even then you would think they would have used the 5775R and not the socketed version. I really didn’t know that anyone ever used socketed 5775Cs in any significant quantity in fact.

That's really cool. I always wanted one of those because they had that extra L4 cache and I was curious if it'd help at all while gaming, but not enough to spend money on it at this point. My 4670K has been a trooper but I upgraded past it a little while ago.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Rexxed posted:

That's really cool. I always wanted one of those because they had that extra L4 cache and I was curious if it'd help at all while gaming, but not enough to spend money on it at this point. My 4670K has been a trooper but I upgraded past it a little while ago.

I still have a Z97 mITX system hanging around with a 4690K, and this will be an interesting use for it. Not gonna drop $300 on it like they used to go for, but for $90 (minus whatever I can get for the old CPU) it’ll be a fun toy.

It’s a gigabyte board so it also should be fairly painless to do hackintosh if I want.

Sadly the rest of the build itself isn’t worth a whole lot anymore, so bumping it to 4C8T and using it as a secondary PC for whatever is probably the best I can do with it.

Paul MaudDib fucked around with this message at 09:26 on Aug 4, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
They used to be highly rated for frame pacing because of the edram but it's old hat nowadays.
I remember a big fuss being kicked up about them just about edging past the 6700k by a knife's edge in games.
Of course it was like a day later that the whole core count thing exploded.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ok Comboomer posted:

a lot of computers (and phones) that win on bullet-point specs with poo poo like “real 4K resolution” on a notebook and/or OLED (or frankly touchscreens for the first couple years of Win8/early Win10) end up being way jankier in practice.

You can say what you want about phones, but the 4K OLEDs in the Razers are gorgeous, 144Hz is hilariously nicer than 60Hz (though you're stuck with 1080p at that point for :reasons: for another ~6 months), and the dGPU options aren't even worth talking about because the MBP hasn't had more than a mid-tier GPU in a decade. So, yeah, at any even vaguely similar price point, the MBP better have superior fit-and-finish, because it's not winning on performance.

And that's fine: Razer makes premium gaming laptops, and that means they're working in a bit of a different market than the MBP, and are selling to people who are willing to accept different sets of trade-offs.

Fame Douglas
Nov 20, 2013

by Fluffdaddy
It's not my primary gaming machine obviously, but I find you can play a surprising amount games on any halfway modern integrated graphics with settings turned down. I've even been using my small Surface Pro 7 for WoW from time to time.

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

DrDork posted:

and the dGPU options aren't even worth talking about because the MBP hasn't had more than a mid-tier GPU in a decade

5600M is pretty ridiculous, but I’m definitely not paying $800 for one

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ok Comboomer posted:

5600M is pretty ridiculous, but I’m definitely not paying $800 for one

It's roughly equivalent to a 2060, which pretty much defines "mid-tier" GPUs right now. While it's not too shabby, it's also a considerable way behind what I'd call "ridiculous." Though asking $800 for it is pretty crazy, yeah. But so is asking $700+ for a 1TB SSD like Lenovo does, so :shrug:

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Ok Comboomer posted:

5600M is pretty ridiculous, but I’m definitely not paying $800 for one

You can buy complete laptops with a 5600M for $800, I saw the Dell G5 going for that with a 6C/12T Ryzen 4600H.

Coffee Jones
Jul 4, 2004

16 bit? Back when we was kids we only got a single bit on Christmas, as a treat
And we had to share it!
Currently have an 8700k.
Am I fine just ignoring anything CPU-related until DDR5 comes out? Like if I were to wait until I wanted to buy new everything for a generational upgrade, would DDR5 be a good line to draw?

Khorne
May 1, 2002

Coffee Jones posted:

Currently have an 8700k.
Am I fine just ignoring anything CPU-related until DDR5 comes out? Like if I were to wait until I wanted to buy new everything for a generational upgrade, would DDR5 be a good line to draw?
You're fine ignoring everything until DDR5 comes out in 2021-2022. Maybe even longer.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Yes. 8700K/9900K and 3600/3700X/3900X are perfectly fine holding out until DDR5. You might even want to wait until the second generation of DDR5 when everything is mature. The next 6 months are going to bring a solid incremental improvement (the first since the 8700K, really), but nothing you can't live without.

Think Sandy Bridge -> Ivy Bridge type transition. Better? Yes. Can't live without? No.

8700K is still basically the top performer for gaming. Delid and/or overclock if you’re feeling the need for speed and haven't already. For productivity, the 3900X and Threadripper exist, but you'll know if you're doing things that need them.

Paul MaudDib fucked around with this message at 02:35 on Aug 5, 2020

Khorne
May 1, 2002

Paul MaudDib posted:

Think Sandy Bridge -> Ivy Bridge type transition. Better? Yes. Can't live without? No.
That transition was marred a bit by clock speed differences, and by Ivy being the first enthusiast-level consumer Intel CPU that used TIM (thermal paste) instead of solder, so it ran blazing hot.

Coffee Jones
Jul 4, 2004

16 bit? Back when we was kids we only got a single bit on Christmas, as a treat
And we had to share it!
I have this trusty old Latitude E6430 and the i7-3720QM is a little hand dryer, even after replacing the fans and heat sink and disabling the Nvidia GPU ... I guess that’s why.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Coffee Jones posted:

Currently have an 8700k.
Am I fine just ignoring anything CPU-related until DDR5 comes out? Like if I were to wait until I wanted to buy new everything for a generational upgrade, would DDR5 be a good line to draw?

Unless you have an actual need for more cores for work, wait

For gaming? You already basically have the best gaming cpu

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
Since we were talking about Macs before I find this video very interesting:
https://www.youtube.com/watch?v=jzT0-t-7-PA

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

punk rebel ecks posted:

Since we were talking about Macs before I find this video very interesting:
https://www.youtube.com/watch?v=jzT0-t-7-PA

isn't there a Mac Pro that runs with a Ryzen/Threadripper CPU? I could almost swear LTT did one a while back

Anime Schoolgirl
Nov 28, 2002

gradenko_2000 posted:

isn't there a Mac Pro that runs with a Ryzen/Threadripper CPU? I could almost swear LTT did one a while back
it's a hackintosh

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

gradenko_2000 posted:

isn't there a Mac Pro that runs with a Ryzen/Threadripper CPU? I could almost swear LTT did one a while back

Yep, the guy in the video built one a few months back too.

Edit: he’s built a bunch of them apparently...at different budgets...using laptops....etc.

https://youtu.be/AXg9sMuGxB0

trilobite terror fucked around with this message at 15:28 on Aug 6, 2020

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
Would be interesting to see how much of the optimization advantage survives on a Hackintosh. That video doesn't compare first-party Apple software.

punk rebel ecks fucked around with this message at 16:25 on Aug 6, 2020

Dr. Fishopolis
Aug 31, 2004

ROBOT
Yeah, it's sort of a dumb excuse. "Apple can afford to sell computers with otherwise bad price/performance because the software is so optimized!"

"but uh... what if you optimized the software and also sold us a computer that was actually a good deal?"

*surprised pikachu face*

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

punk rebel ecks posted:

Since we were talking about Macs before I find this video very interesting:
https://www.youtube.com/watch?v=jzT0-t-7-PA

The gotcha with all of these videos is that the only way you get to a $50k mac is by stuffing it full of RAM sticks that are each like $2k even on the aftermarket let alone once they're sourced through an OEM, and none of the systems they're building are even capable of supporting that much RAM in the first place. If you build a truly equivalent workstation with 1.5 TB of RAM it's still gonna be at least $30k, let alone once you add the markup of going through an OEM.

Picking the "$11k mac" as a specific number is done because that's the config whose RAM tops out at what the "competing PC" will support - so on the Apple side you are taking a dump truck and racing it against a ferrari on the PC side. Now benchmark that PC with its 256GB of memory on an in-memory workload with a 1.5 TB working set and watch it run thousands of times slower than a maxed-out mac.

Go ask HP to spec you a 1.5 TB workstation with one of those new TRX80 motherboards (they support RDIMM/LRDIMM so they can hit the memory capacities) and see how much that comes out to. The number will be not terribly dissimilar to that $50k mac. Probably faster for the price, but the $11k machine you put together in your garage isn't remotely comparable here.

If you're just running cinebench - yeah who cares, you didn't need a $50k workstation in the first place. That's not why you'd be buying a workstation with 1.5 TB of RAM.
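The "thousands of times slower" figure is roughly the ratio of a DRAM access to a page fault serviced from storage. A back-of-envelope sketch (the latency numbers are ballpark assumptions, not measurements from any real system):

```python
# Rough ratio of DRAM access latency to a page fault serviced from an
# NVMe SSD. Ballpark assumptions, not measurements.
DRAM_NS = 100            # ~100 ns for a cache-missing DRAM access
NVME_FAULT_NS = 100_000  # ~100 us to fault a 4K page in from NVMe

slowdown = NVME_FAULT_NS / DRAM_NS
print(f"~{slowdown:.0f}x slower when the working set spills to disk")
# → ~1000x slower when the working set spills to disk
```

Real-world thrashing is messier than a single ratio (readahead helps, random access hurts), but the orders of magnitude are why "it pages to disk" ends an in-memory workload's usefulness.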

Paul MaudDib fucked around with this message at 18:54 on Aug 6, 2020

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

Dr. Fishopolis posted:

Yeah, it's sort of a dumb excuse. "Apple can afford to sell computers with otherwise bad price/performance because the software is so optimized!"

"but uh... what if you optimized the software and also sold us a computer that was actually a good deal?"

*surprised pikachu face*

Didn't Apple use to do this when they had PowerPC which is why they dominated the media creation world?

Paul MaudDib posted:

The gotcha with all of these videos is that the only way you get to a $50k mac is by stuffing it full of RAM sticks that are each like $2k even on the aftermarket let alone once they're sourced through an OEM, and none of the systems they're building are even capable of supporting that much RAM in the first place. If you build a truly equivalent workstation with 1.5 TB of RAM it's still gonna be at least $30k, let alone once you add the markup of going through an OEM.

picking "$11k mac" as a specific number is done specifically because that's the largest amount of RAM that the "competing PC" will support, so on the apple side you are taking a dump truck and racing it against a ferrari on the PC side. Now benchmark that $11k PC with 256GB of memory on an in-memory workload with a 1.5 TB working set and watch it run thousands of times slower than the mac. wow, guess PC sucks doesn't it :smuggo:

Go ask HP to spec you a 1.5 TB workstation with one of those new TRX80 motherboards (they support RDIMM/LRDIMM so they can hit the memory capacities) and see how much that comes out to. The number will be not terribly dissimilar to that $50k mac. Probably faster for the price, but the $11k machine you put together in your garage isn't remotely comparable here.

If you're just running cinebench - yeah who cares, you didn't need a $50k workstation in the first place. That's not why you'd be buying a workstation with 1.5 TB of RAM.

I'm too stupid to understand this.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

punk rebel ecks posted:

I'm too stupid to understand this.

Mac Pro supports up to 1.5 TB of RAM because it has RDIMM support (different type of memory that allows a much larger capacity per stick). Memory is expensive so actually populating this system to its maximum capacity is extremely expensive and that's how you get to ludicrous numbers like a "$50k mac".

The PCs they're building are consumer tier gear that doesn't support RDIMMs, thus it only supports 256 GB of RAM. But it has a lot more cores, so if what you are doing isn't particularly memory-heavy it will be a lot faster for a lot less money.

But, if you tried to put that PC with 256GB of RAM onto a workload that needed the workstation's 1.5 TB of RAM, it would suck rear end: it would be constantly paging out to disk and end up thousands of times slower. The consumer build just can't do some things that the workstation can handle easily, even though it's got more cores and faster cores. These comparisons are always specifically testing only the workloads that the consumer rig can do, where the workstation is of course not quite as good in performance terms and offers terrible value. You're putting a ferrari up against a dump truck in a race - of course the ferrari wins, unless the test is to see how fast you can move 50 tons of dirt, in which case the ferrari doesn't do so well.

Additionally, ordering things from OEMs is just more expensive in general. That 128GB stick of RDIMM memory is $2k if you go and buy it yourself, but if you buy it from HP it's going to be double that. Apple is no different. It's like building a PC yourself vs buying a prebuilt, the prebuilt is always worse value (but also has support bundled in/etc).

Net effect: if you spec out a comparable workstation that can handle 1.5 TB of RAM from a competing vendor, it's going to be a bit cheaper, or somewhat faster for the same money. But a lot of that "$50k" mac is actually just the fact that they've maxed out the RAM, and RAM is really expensive (>$2k per stick not from an OEM). There is probably $30-40k of RAM in that $50k mac workstation.

A comparable machine from another vendor is not going to be $11k; it is going to be $40k, or $50k for a build that's better than the Mac Pro.
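The arithmetic behind that is easy to sanity-check. A quick sketch using the figures assumed in this thread (roughly $2k per 128GB LRDIMM on the open market, about doubled through an OEM - not real quotes):

```python
# Back-of-envelope for the "most of the $50k is RAM" claim.
# Assumed figures from the thread, not real vendor quotes.
STICK_GB = 128
STICK_PRICE = 2_000   # ~$2k per 128GB LRDIMM aftermarket
OEM_MARKUP = 2.0      # OEMs roughly double it
TARGET_GB = 1536      # 1.5 TB

sticks = TARGET_GB // STICK_GB           # 12 sticks
ram_street = sticks * STICK_PRICE        # buying it yourself
ram_oem = int(ram_street * OEM_MARKUP)   # through an OEM

print(f"{sticks} sticks: ${ram_street:,} street, ~${ram_oem:,} OEM")
# → 12 sticks: $24,000 street, ~$48,000 OEM
```

Which is how a workstation with an unremarkable base price balloons toward $50k once the RAM is maxed, regardless of the badge on the case.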

Paul MaudDib fucked around with this message at 19:10 on Aug 6, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

Mac Pro supports up to 1.5 TB of RAM because it has RDIMM support (different type of memory that allows a much larger capacity per stick). Memory is expensive so actually populating this system to its maximum capacity is extremely expensive and that's how you get to ludicrous numbers like a "$50k mac".

Yeah, as a random point of reference, a Dell Precision 7920, which supports RDIMMs, will cost you $24,300 just for 1.5TB of RAM. If you really want to get silly, it supports up to 3TB of the stuff--for $91,000 (plus the rest of the computer, including a minimum ~$4000 Xeon CPU).

Depending on what CPU you feel like selecting to pair with the 1.5TB RAM, the Dell comes out pretty similarly priced in the end.
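Those Dell numbers work out to a steep per-GB premium at the top end. A quick check using the list prices quoted above (as stated in the post, not current Dell pricing):

```python
# Per-GB cost of the quoted Dell Precision 7920 RAM configs.
configs = {
    "1.5TB": (24_300, 1536),  # (quoted price in USD, capacity in GB)
    "3TB":   (91_000, 3072),
}

for name, (price, gb) in configs.items():
    print(f"{name}: ${price / gb:.2f}/GB")
# → 1.5TB: $15.82/GB
# → 3TB: $29.62/GB
```

The 3TB tier costs nearly twice as much per GB, presumably because it needs the densest DIMMs available, which carry their own scarcity premium.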

Dr. Fishopolis
Aug 31, 2004

ROBOT
I just watched that video, where are you getting that $50k comparison? From that one comment at the start? Nothing he's doing seems relevant to your post, I'm super confused. Those machines only have 96GB of RAM installed.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Fishopolis posted:

I just watched that video, where are you getting that $50k comparison? From that one comment at the start? Nothing he's doing seems relevant to your post, I'm super confused. Those machines only have 96gb of ram installed.

when the mac pro was launched there was a flood of "omg $50k mac! I can build a threadripper for like $4k!" and I'm responding to that as much as this specific video. The point is that's not a fair comparison, that mac spec has a lot of capabilities that the threadripper doesn't, namely having six times the RAM.

if you are only installing 96GB in that system you are wasting it and you didn't need a workstation in the first place. You can put 128GB on a 3950X if you want. And of course taking a workstation and only populating it to AM4 level capacity is an absolutely terrible value proposition, regardless of brand. That is what I'm specifically complaining about.

the reason this specific video is $11k is that he's setting up the tests to be something that the consumer PC will be capable of handling. But the workstation can be populated much higher and handle tasks that the consumer PC can't, and that is the reason you buy the workstation, not because it's faster at small consumer-level cinebench/cinema 4D workloads.

It's not really an intellectually honest video; they're setting up a pre-determined conclusion by using a workstation for consumer workloads, and unsurprisingly it ends up being a horrible value proposition. And it's trendy to dunk on Apple (which is justified, their PC hardware sucks), but in this specific case you could substitute any other OEM and end up with an equally bad result - like DrDork says, populating 1.5TB on a Dell is $25k just for the memory alone, and depending on the CPU you end up around $50k just like with the Mac Pro. Taking that Dell and putting 96GB in it would be a horrible waste of money too.

The Mac Pro is still a bad deal but it's like "$50k for a machine you can get for $45k elsewhere" level bad, or "$11k for a machine that would cost me $9k elsewhere" level bad, not "$11k mac dunked on by $2k pc". People are just making clickbait videos to cash in on apple hate, the machines they're building aren't comparable, the tasks they're doing are just limited enough not to show the difference.

Paul MaudDib fucked around with this message at 20:01 on Aug 6, 2020

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull
DrDork's comment about what CPU one selects is very relevant here. In the Xeon W series, Intel uses support for >1TB memory as a market segmentation thing and charges $3000 for it. For example, look at the differences (highlighted in blue) between these two nearly-identical models:

https://ark.intel.com/content/www/us/en/ark/compare.html?productIds=193752,193754

Apple uses these "M" suffix Xeons for their 24 and 28 core CPU options, so if you want lots of cores, you're paying $3000 for a fuse bit even if you don't put in lots of RAM.

I'm sure this kind of game has nothing to do with why they're switching


(actually if that was Intel's only fault I'm sure Apple would still be happy. They're not a workstation company, they could happily live with some bullshit in that segment. The real problem, I assume, is their bread and butter: laptops.)

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

BobHoward posted:

Apple uses these "M" suffix Xeons for their 24 and 28 core CPU options, so if you want lots of cores, you're paying $3000 for a fuse bit even if you don't put in lots of RAM.

I'm sure this kind of game has nothing to do with why they're switching

That would seem to be a specific and intentional decision by Apple--Dell lets you select from like 40 different CPUs, rather than 5. More likely, they figure that if you're buying a Mac Workstation that needs 24/28 cores, you're doing that because you're going to fill it up with RAM, because what would you be doing with that many cores otherwise?

But yeah, I think you're right and their real grump against Intel is related to delays in 10/7nm and what that means for their laptops. Though AMD is now also saying they're delaying their laptop chips 2 months, so apparently no one can deliver mobile chips in volume right now.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

DrDork posted:

That would seem to be a specific and intentional decision by Apple--Dell lets you select from like 40 different CPUs, rather than 5.

Well, yeah - Apple isn't Dell, they want to keep the option list as small as possible. One could argue this isn't the right thing for a workstation, but this is a place where Apple is just going to do its thing and pare down the set of options they present.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

BobHoward posted:

Well, yeah - Apple isn't Dell, they want to keep the option list as small as possible. One could argue this isn't the right thing for a workstation, but this is a place where Apple is just going to do its thing and pare down the set of options they present.

True. I mean, at the end of the day, Apple isn't Dell in another important way: they're not competing directly with really anyone else. You buy a Mac Pro because you want a Mac workstation for whatever reason (specific software, integration into your existing environment, etc), and if that means that you end up buying more machine than you strictly need because a more exact fit option isn't available, well, you're gonna pay what Apple tells you to pay for that.

Dell has to fight it out with Lenovo and HP (among others) for anyone shopping around for a PC workstation, so they need to offer more varied options to make sure that they can come in with the lowest price possible, instead of forcing people to bump up tiers by not selling all the possible setups.

repiv
Aug 13, 2009

:munch:

https://twitter.com/deletescape/status/1291405690301550592

Dr. Fishopolis
Aug 31, 2004

ROBOT

Paul MaudDib posted:

if you are only installing 96GB in that system you are wasting it and you didn't need a workstation in the first place. You can put 128GB on a 3950X if you want. And of course taking a workstation and only populating it to AM4 level capacity is an absolutely terrible value proposition, regardless of brand. That is what I'm specifically complaining about.

the reason this specific video is $11k is that he's setting up the tests to be something that the consumer PC will be capable of handling. But the workstation can be populated much higher and handle tasks that the consumer PC can't, and that is the reason you buy the workstation, not because it's faster at small consumer-level cinebench/cinema 4D workloads.

Okay sure, but literally no person, not a single soul, is buying a mac pro to train on 1TB+ datasets. Your argument makes sense from a processor market segmentation standpoint, but Apple is selling something that's ostensibly a media production workstation. There shouldn't even be Xeons in there to begin with.

BlankSystemDaemon
Mar 13, 2009



punk rebel ecks posted:

Didn't Apple use to do this when they had PowerPC which is why they dominated the media creation world?
PowerPC used to have easily-demonstrable benefits for image manipulation, in that it could do FP and math operations in a single cycle really efficiently compared to an equivalent-era Intel processor - something that stayed true especially after the G4 got the AltiVec unit in the PowerPC 7400, along with SMP, a 64-bit FPU, cache-coherency optimizations, and other things.
Eventually Intel caught up and surpassed IBM, though, and with neither the POWER nor ARM ISAs being strictly RISC anymore, it's looking like all architectures will eventually converge at around the same clocks on the same node - although that might be more of a hope for increased competition and for Moore's law to finally kick in (given that the most-used word in the paper that inspired Moore's law is "price", it's a bit sad that prices haven't really changed in a very long time).

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Fishopolis posted:

Okay sure but literally no person, not a single soul is buying a mac pro to train 1tb+ datasets. Your argument makes sense from a processor market segmentation standpoint but Apple is selling something that's ostensibly a media production workstation. There shouldn't even be xeons in there to begin with.

yeah I mean it's probably true that very few of apple's customers really need a high-end workstation tier product. Maybe the pixar types who want to do gargantuan models and stuff.

Normally it's a thing for computational simulations and in-memory databases (SAP HANA, etc) but those aren't the customers you think of as flocking to apple.

WhyteRyce
Dec 30, 2001

:lol:
https://twitter.com/deletescape/status/1291422841834016770

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

Hachi machi! :f5:

WhyteRyce
Dec 30, 2001

priznat posted:

Hachi machi! :f5:

quote:

"If you find password protected zips in the release the password is probably either "Intel123" or "intel123". This was not set by me or my source, this is how it was acquired from Intel."
