|
If you're happy with the Switch experience you could play at low settings and cap your framerates really aggressively to keep the load down. Depending on how the thermal management is set up that may keep the fans under control. But yeah in terms of pure convenience nothing is remotely close to a Switch for portable gaming. Portable versions of high-performance computing parts are never going to compete on efficiency with actual mobile parts.
|
# ? Aug 4, 2020 02:58 |
|
|
DrDork posted:Not sure which iteration of Razer's laptops you were using, but the current Stealth 15 starts with a 2070 and 144Hz or 4k OLED, which...uh....is considerably different from what you can get out of a MBP, regardless of price. Their keyboards are take it or leave it, though, and their trackpads try to be MBP ones but don't quite measure up. I dig the black aluminum frame, though. a lot of computers (and phones) that win on bullet-point specs with poo poo like “real 4K resolution” on a notebook and/or OLED (or frankly touchscreens for the first couple years of Win8/early Win10) end up being way jankier in practice. I see this firsthand on Android phones with high/variable refresh OLED displays. Flagship phone screens straight up changing brightness and color behavior when switching refresh rate. Totally inconsistent off-axis performance. Weird color shifting at different brightness levels. Meanwhile everybody gave Apple poo poo for a hot second for shipping “just another LCD screen” on the iPhone XR, never mind that it went on to be rated best-in-class for its price bracket in terms of day-to-day QoL. And this isn’t an isolated thing: LCDs on Apple devices routinely outperform OLEDs on similarly priced products from competitors. I’ve used a few Razer notebooks in the last decade and they always feel 10-20% off from the bar set by a comparably priced MBP. Like, they’re clearly aiming for the same target but they can’t get the fit+finish dialed in that last bit. trilobite terror fucked around with this message at 05:05 on Aug 4, 2020 |
# ? Aug 4, 2020 05:03 |
|
A decent number of 5775Cs seem to have hit the surplus market in China and are now on eBay in the $90-120 range. Arguably not a bad deal if you’re looking to upgrade a 4690K system or whatever, if you can catch it bouncing down to $90 again and flip your 4690K for like 50 bucks. I ordered one when they were first listed and it showed up today; haven’t had a chance to install it, but it externally appears to be a normal, non-ES chip, just as advertised. I’m really curious where these would ever have been used such that they’re now finding their way onto the surplus market. Someone speculated to me it was maybe an Apple product, perhaps the iMac Pro? But even then you would think they would have used the 5775R and not the socketed version. In fact, I didn’t know that anyone ever used socketed 5775Cs in any significant quantity. Paul MaudDib fucked around with this message at 05:56 on Aug 4, 2020 |
# ? Aug 4, 2020 05:47 |
|
Paul MaudDib posted:A decent number of 5775C seem to have hit the surplus market in China and are now on eBay at the $90-120 range. Arguably not a bad deal if you’re looking to upgrade a 4690K system or whatever, if you can catch it bouncing down to $90 again and flip your 4690K for like 50 bucks. That's really cool. I always wanted one of those because they had that extra L4 cache and I was curious if it'd help at all while gaming, but not enough to spend money on it at this point. My 4670K has been a trooper but I upgraded past it a little while ago.
|
# ? Aug 4, 2020 08:59 |
|
Rexxed posted:That's really cool. I always wanted one of those because they had that extra L4 cache and I was curious if it'd help at all while gaming, but not enough to spend money on it at this point. My 4670K has been a trooper but I upgraded past it a little while ago. I still have a Z97 mITX system hanging around with a 4690K, and this will be an interesting use for it. Not gonna drop $300 on it like they used to go for, but for $90 (minus whatever I can get for the old CPU) it’ll be a fun toy. It’s a gigabyte board so it also should be fairly painless to do hackintosh if I want. Sadly the rest of the build itself isn’t worth a whole lot anymore, so bumping it to 4C8T and using it as a secondary PC for whatever is probably the best I can do with it. Paul MaudDib fucked around with this message at 09:26 on Aug 4, 2020 |
# ? Aug 4, 2020 09:21 |
|
They used to be highly rated for frame pacing because of the eDRAM, but it's old hat nowadays. I remember a big fuss being kicked up about them just barely edging past the 6700K in games. Of course it was like a day later that the whole core count thing exploded.
|
# ? Aug 4, 2020 09:23 |
|
Ok Comboomer posted:a lot of computers (and phones) that win on bullet-point specs with poo poo like “real 4K resolution” on a notebook and/or OLED (or frankly touchscreens for the first couple years of Win8/early Win10) end up being way jankier in practice. You can say what you want about phones, but the 4K OLEDs in the Razers are gorgeous, 144Hz is hilariously nicer than 60Hz (though you're stuck with 1080p at that point for :reasons: for another ~6 months), and the dGPU options aren't even worth talking about because the MBP hasn't had more than a mid-tier GPU in a decade. So, yeah, at any even vaguely similar price point, the MBP had better have superior fit-and-finish, because it's not winning on performance. And that's fine: Razer makes premium gaming laptops, which means they're working in a bit of a different market than the MBP and are selling to people who are willing to accept different sets of trade-offs.
|
# ? Aug 4, 2020 14:18 |
|
It's not my primary gaming machine obviously, but I find you can play a surprising number of games on any halfway modern integrated graphics with the settings turned down. I've even been using my small Surface Pro 7 for WoW from time to time.
|
# ? Aug 4, 2020 14:25 |
|
DrDork posted:and the dGPU options aren't even worth talking about because the MBP hasn't had more than a mid-tier GPU in a decade 5600M is pretty ridiculous, but I’m definitely not paying $800 for one
|
# ? Aug 4, 2020 14:39 |
|
Ok Comboomer posted:5600M is pretty ridiculous, but I’m definitely not paying $800 for one It's roughly equivalent to a 2060, which pretty much defines "mid-tier" GPUs right now. While it's not too shabby, it's also a considerable way behind what I'd call "ridiculous." Though asking $800 for it is pretty crazy, yeah. But so is asking $700+ for a 1TB SSD like Lenovo does, so
|
# ? Aug 4, 2020 15:18 |
|
Ok Comboomer posted:5600M is pretty ridiculous, but I’m definitely not paying $800 for one You can buy complete laptops with a 5600M for $800, I saw the Dell G5 going for that with a 6C/12T Ryzen 4600H.
|
# ? Aug 4, 2020 15:51 |
Currently have an 8700k. Am I fine just ignoring anything CPU-related until DDR5 comes out? Like if I were to wait until I wanted to buy new everything for a generational upgrade, would DDR5 be a good line to draw?
|
|
# ? Aug 5, 2020 02:10 |
|
Coffee Jones posted:Currently have an 8700k.
|
# ? Aug 5, 2020 02:13 |
|
Yes. 8700K/9900K and 3600/3700X/3900X are perfectly fine holding out until DDR5. You might even want to wait until the second generation of DDR5 when everything is mature. The next 6 months are going to be a significant incremental improvement (the first since the 8700K, really) but nothing you can't live without. Think Sandy Bridge -> Ivy Bridge type transition. Better? Yes. Can't live without? No. The 8700K is still basically the top performer for gaming. Delid and/or overclock if you’re feeling the need for speed and you haven't already. For productivity, the 3900X and Threadripper exist, but you'll know if you're doing things that need them. Paul MaudDib fucked around with this message at 02:35 on Aug 5, 2020 |
# ? Aug 5, 2020 02:30 |
|
Paul MaudDib posted:Think Sandy Bridge -> Ivy Bridge type transition. Better? Yes. Can't live without? No.
|
# ? Aug 5, 2020 04:20 |
I have this trusty old Latitude E6430 and the i7-3720QM is a little hand dryer, even after replacing the fans and heat sink and disabling the Nvidia GPU ... I guess that’s why.
|
|
# ? Aug 5, 2020 05:42 |
|
Coffee Jones posted:Currently have an 8700k. Unless you have an actual need for more cores for work, wait. For gaming? You already basically have the best gaming CPU
|
# ? Aug 5, 2020 15:22 |
|
Since we were talking about Macs before I find this video very interesting: https://www.youtube.com/watch?v=jzT0-t-7-PA
|
# ? Aug 6, 2020 08:31 |
|
punk rebel ecks posted:Since we were talking about Macs before I find this video very interesting: isn't there a Mac Pro that runs with a Ryzen/Threadripper CPU? I could almost swear LTT did one a while back
|
# ? Aug 6, 2020 09:28 |
|
gradenko_2000 posted:isn't there a Mac Pro that runs with a Ryzen/Threadripper CPU? I could almost swear LTT did one a while back
|
# ? Aug 6, 2020 10:10 |
|
gradenko_2000 posted:isn't there a Mac Pro that runs with a Ryzen/Threadripper CPU? I could almost swear LTT did one a while back Yep, the guy in the video built one a few months back too. Edit: he’s built a bunch of them apparently...at different budgets...using laptops....etc. https://youtu.be/AXg9sMuGxB0 trilobite terror fucked around with this message at 15:28 on Aug 6, 2020 |
# ? Aug 6, 2020 15:24 |
|
Would be interesting to see how well the optimization advantage holds up on a Hackintosh. That video doesn't compare first-party Apple software.
punk rebel ecks fucked around with this message at 16:25 on Aug 6, 2020 |
# ? Aug 6, 2020 16:22 |
|
Yeah, it's sort of a dumb excuse. "Apple can afford to sell computers with otherwise bad price/performance because the software is so optimized!" "but uh... what if you optimized the software and also sold us a computer that was actually a good deal?" *surprised pikachu face*
|
# ? Aug 6, 2020 18:39 |
|
punk rebel ecks posted:Since we were talking about Macs before I find this video very interesting: The gotcha with all of these videos is that the only way you get to a $50k mac is by stuffing it full of RAM sticks that are each like $2k even on the aftermarket, let alone once they're sourced through an OEM, and none of the systems they're building are even capable of supporting that much RAM in the first place. If you build a truly equivalent workstation with 1.5 TB of RAM it's still gonna be at least $30k, let alone once you add the markup of going through an OEM. Picking an "$11k mac" as the specific comparison point is done because that's the largest amount of RAM that the "competing PC" will support, so on the apple side you are taking a dump truck and racing it against a ferrari on the PC side. Now benchmark that PC with 256GB of memory on an in-memory workload with a 1.5 TB working set and watch it run thousands of times slower than the mac. Go ask HP to spec you a 1.5 TB workstation with one of those new TRX80 motherboards (they support RDIMM/LRDIMM so they can hit the memory capacities) and see how much that comes out to. The number will be not terribly dissimilar to that $50k mac. Probably faster for the price, but the $11k machine you put together in your garage isn't remotely comparable here. If you're just running Cinebench - yeah, who cares, you didn't need a $50k workstation in the first place. That's not why you'd be buying a workstation with 1.5 TB of RAM. Paul MaudDib fucked around with this message at 18:54 on Aug 6, 2020 |
# ? Aug 6, 2020 18:47 |
|
Dr. Fishopolis posted:Yeah, it's sort of a dumb excuse. "Apple can afford to sell computers with otherwise bad price/performance because the software is so optimized!" Didn't Apple use to do this when they had PowerPC which is why they dominated the media creation world? Paul MaudDib posted:The gotcha with all of these videos is that the only way you get to a $50k mac is by stuffing it full of RAM sticks that are each like $2k even on the aftermarket let alone once they're sourced through an OEM, and none of the systems they're building are even capable of supporting that much RAM in the first place. If you build a truly equivalent workstation with 1.5 TB of RAM it's still gonna be at least $30k, let alone once you add the markup of going through an OEM. I'm too stupid to understand this.
|
# ? Aug 6, 2020 18:53 |
|
punk rebel ecks posted:I'm too stupid to understand this. Mac Pro supports up to 1.5 TB of RAM because it has RDIMM support (a different type of memory that allows a much larger capacity per stick). Memory is expensive, so actually populating this system to its maximum capacity is extremely expensive, and that's how you get to ludicrous numbers like a "$50k mac". The PCs they're building are consumer-tier gear that doesn't support RDIMMs, thus it only supports 256 GB of RAM. But it has a lot more cores, so if what you are doing isn't particularly memory-heavy it will be a lot faster for a lot less money. But, if you tried to put that PC with 256GB of RAM onto a workload that needed the workstation's 1.5 TB of RAM, it would suck rear end: it would be constantly paging out to disk, and it would be thousands of times slower. The consumer build just can't do some things that the workstation can handle easily, even though it's got more cores and faster cores. These comparisons are always specifically only testing the workloads that the consumer rig can do, and the workstation is of course not quite as good in performance terms and offers terrible value for those tasks. You're putting a ferrari up against a dump truck in a race - of course the ferrari wins, unless the test is to see how fast you can move 50 tons of dirt, in which case the ferrari doesn't do so well. Additionally, ordering things from OEMs is just more expensive in general. That 128GB stick of RDIMM memory is $2k if you go and buy it yourself, but if you buy it from HP it's going to be double that. Apple is no different. It's like building a PC yourself vs buying a prebuilt: the prebuilt is always worse value (but also has support bundled in/etc). Net effect: if you spec out a comparable workstation that can handle 1.5 TB of RAM from a competing vendor, it's going to be a bit cheaper, or somewhat faster for the same money.
But a lot of that "$50k" mac is actually just the fact that they've maxed out the RAM, and RAM is really expensive (>$2k per stick not from an OEM). There is probably $30-40k of RAM in that $50k mac workstation. A comparable machine is not going to be $11k from another vendor, it is going to be $40k, or $50k and you get a build that will be better than the Mac Pro. Paul MaudDib fucked around with this message at 19:10 on Aug 6, 2020 |
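To put rough numbers on the paging point, here's a back-of-envelope sketch. The latency figures and the uniform-access assumption are mine, not anything measured in the videos, and real workloads with locality would fare better - but it shows why "thousands of times slower" is the right order of magnitude:

```python
# Back-of-envelope: effective memory access time when a 1.5 TB working
# set runs on a box with 256 GB of RAM and pages to an NVMe SSD.
# Latencies are round ballpark assumptions, not measurements.
ram_ns = 100            # ~100 ns for a DRAM access
ssd_ns = 100_000        # ~100 us to service a page fault from NVMe

working_set_gb = 1536   # 1.5 TB in-memory workload
ram_gb = 256            # consumer platform's max RAM

# Crude model: accesses are uniform over the working set, so the
# fraction that miss RAM and go to disk is whatever doesn't fit.
miss_rate = (working_set_gb - ram_gb) / working_set_gb
effective_ns = (1 - miss_rate) * ram_ns + miss_rate * ssd_ns

slowdown = effective_ns / ram_ns
print(f"miss rate: {miss_rate:.0%}, slowdown vs all-RAM: ~{slowdown:.0f}x")
```

Even with a fast SSD doing the paging, the average access ends up hundreds of times slower than all-in-RAM, which is the whole reason the workstation exists.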
# ? Aug 6, 2020 19:02 |
|
Paul MaudDib posted:Mac Pro supports up to 1.5 TB of RAM because it has RDIMM support (different type of memory that allows a much larger capacity per stick). Memory is expensive so actually populating this system to its maximum capacity is extremely expensive and that's how you get to ludicrous numbers like a "$50k mac". Yeah, as a random point of reference, a Dell Precision 7920, which supports RDIMMs, will cost you $24,300 just for 1.5TB of RAM. If you really want to get silly, it supports up to 3TB of the stuff--for $91,000 (plus the rest of the computer, including a minimum ~$4000 Xeon CPU). Depending on what CPU you feel like selecting to pair with the 1.5TB RAM, the Dell comes out pretty similarly priced in the end.
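Quick math on those Dell configs - the per-GB figures below are just derived from the prices quoted above (taking 1.5TB as 1536GB and 3TB as 3072GB), and my guess is the jump at the top tier reflects the scarcer 256GB LRDIMMs:

```python
# Per-GB cost of the Dell Precision RAM configs quoted above.
configs = {
    "1.5TB": (1536, 24_300),  # capacity in GB, configurator price in USD
    "3TB": (3072, 91_000),
}
for name, (gb, usd) in configs.items():
    # The 3TB tier nearly doubles the per-GB price, not just the total.
    print(f"{name}: ${usd / gb:.2f}/GB")
```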
|
# ? Aug 6, 2020 19:16 |
|
I just watched that video, where are you getting that $50k comparison? From that one comment at the start? Nothing he's doing seems relevant to your post, I'm super confused. Those machines only have 96GB of RAM installed.
|
# ? Aug 6, 2020 19:16 |
|
Dr. Fishopolis posted:I just watched that video, where are you getting that $50k comparison? From that one comment at the start? Nothing he's doing seems relevant to your post, I'm super confused. Those machines only have 96gb of ram installed. when the mac pro was launched there was a flood of "omg $50k mac! I can build a threadripper for like $4k!" and I'm responding to that as much as this specific video. The point is that's not a fair comparison: that mac spec has a lot of capabilities that the threadripper doesn't, namely having six times the RAM. If you are only installing 96GB in that system you are wasting it and you didn't need a workstation in the first place. You can put 128GB on a 3950X if you want. And of course taking a workstation and only populating it to AM4-level capacity is an absolutely terrible value proposition, regardless of brand. That is what I'm specifically complaining about. The reason this specific video is $11k is that he's setting up the tests to be something that the consumer PC will be capable of handling. But the workstation can be populated much higher and handle tasks that the consumer PC can't, and that is the reason you buy the workstation, not because it's faster at small consumer-level cinebench/cinema 4D workloads. It's not really an intellectually honest video, they're setting up a pre-determined conclusion by using a workstation for consumer workloads, and unsurprisingly it ends up being a horrible value proposition. And it's trendy to dunk on Apple (which is justified, their PC hardware sucks) but in this specific case you could substitute any other OEM and end up with an equally bad result - like DrDork says, populating 1.5TB on a Dell is $25k just for the memory alone, depending on the CPU you end up around $50k just like with the Mac Pro, and taking that Dell and putting 96GB on it would be a horrible waste of money too.
The Mac Pro is still a bad deal but it's like "$50k for a machine you can get for $45k elsewhere" level bad, or "$11k for a machine that would cost me $9k elsewhere" level bad, not "$11k mac dunked on by $2k pc". People are just making clickbait videos to cash in on apple hate, the machines they're building aren't comparable, the tasks they're doing are just limited enough not to show the difference. Paul MaudDib fucked around with this message at 20:01 on Aug 6, 2020 |
# ? Aug 6, 2020 19:22 |
|
DrDork's comment about what CPU one selects is very relevant here. In the Xeon W series, Intel uses support for >1TB memory as a market segmentation thing and charges $3000 for it. For example, look at the differences (highlighted in blue) between these two nearly-identical models: https://ark.intel.com/content/www/us/en/ark/compare.html?productIds=193752,193754 Apple uses these "M" suffix Xeons for their 24 and 28 core CPU options, so if you want lots of cores, you're paying $3000 for a fuse bit even if you don't put in lots of RAM. I'm sure this kind of game has nothing to do with why they're switching (actually if that was Intel's only fault I'm sure Apple would still be happy. They're not a workstation company, they could happily live with some bullshit in that segment. The real problem, I assume, is their bread and butter: laptops.)
|
# ? Aug 6, 2020 19:23 |
|
BobHoward posted:Apple uses these "M" suffix Xeons for their 24 and 28 core CPU options, so if you want lots of cores, you're paying $3000 for a fuse bit even if you don't put in lots of RAM. That would seem to be a specific and intentional decision by Apple--Dell lets you select from like 40 different CPUs, rather than 5. More likely, they figure that if you're buying a Mac Workstation that needs 24/28 cores, you're doing that because you're going to fill it up with RAM, because what would you be doing with that many cores otherwise? But yeah, I think you're right and their real grump against Intel is related to delays in 10/7nm and what that means for their laptops. Though AMD is now also saying they're delaying their laptop chips 2 months, so apparently no one can deliver mobile chips in volume right now.
|
# ? Aug 6, 2020 19:32 |
|
DrDork posted:That would seem to be a specific and intentional decision by Apple--Dell lets you select from like 40 different CPUs, rather than 5. Well, yeah - Apple isn't Dell, they want to keep the option list as small as possible. One could argue this isn't the right thing for a workstation, but this is a place where Apple is just going to do its thing and pare down the set of options they present.
|
# ? Aug 6, 2020 19:51 |
|
BobHoward posted:Well, yeah - Apple isn't Dell, they want to keep the option list as small as possible. One could argue this isn't the right thing for a workstation, but this is a place where Apple is just going to do its thing and pare down the set of options they present. True. I mean, at the end of the day, Apple isn't Dell in another important way: they're not competing directly with really anyone else. You buy a Mac Pro because you want a Mac workstation for whatever reason (specific software, integration into your existing environment, etc), and if that means that you end up buying more machine than you strictly need because a more exact fit option isn't available, well, you're gonna pay what Apple tells you to pay for that. Dell has to fight it out with Lenovo and HP (among others) for anyone shopping around for a PC workstation, so they need to offer more varied options to make sure that they can come in with the lowest price possible, instead of forcing people to bump up tiers by not selling all the possible setups.
|
# ? Aug 6, 2020 19:56 |
|
https://twitter.com/deletescape/status/1291405690301550592
|
# ? Aug 6, 2020 20:09 |
|
Paul MaudDib posted:if you are only installing 96GB in that system you are wasting it and you didn't need a workstation in the first place. You can put 128GB on a 3950X if you want. And of course taking a workstation and only populating it to AM4 level capacity is an absolutely terrible value proposition, regardless of brand. That is what I'm specifically complaining about. Okay sure, but literally no person, not a single soul, is buying a Mac Pro to train 1TB+ datasets. Your argument makes sense from a processor market segmentation standpoint, but Apple is selling something that's ostensibly a media production workstation. There shouldn't even be Xeons in there to begin with.
|
# ? Aug 6, 2020 20:10 |
punk rebel ecks posted:Didn't Apple use to do this when they had PowerPC which is why they dominated the media creation world? Eventually Intel caught up with and surpassed IBM, though. With neither POWER nor ARM ISAs being strictly RISC, it's looking like all architectures will eventually converge at around the same clocks on the same node - although that might be more of a hope for increased competition and for Moore's law to finally kick back in (given that the most-used word in the paper that inspired Moore's law is "price," it's a bit sad that prices haven't really changed in a very, very long time).
|
|
# ? Aug 6, 2020 20:13 |
|
Dr. Fishopolis posted:Okay sure but literally no person, not a single soul is buying a mac pro to train 1tb+ datasets. Your argument makes sense from a processor market segmentation standpoint but Apple is selling something that's ostensibly a media production workstation. There shouldn't even be xeons in there to begin with. yeah I mean it's probably true that very few of apple's customers really need a high-end workstation tier product. Maybe the pixar types who want to do gargantuan models and stuff. Normally it's a thing for computational simulations and in-memory databases (SAP HANA, etc) but those aren't the customers you think of as flocking to apple.
|
# ? Aug 6, 2020 20:16 |
|
https://twitter.com/deletescape/status/1291422841834016770
|
# ? Aug 6, 2020 20:17 |
|
Hachi machi!
|
# ? Aug 6, 2020 20:19 |
|
|
priznat posted:Hachi machi! quote:"If you find password protected zips in the release the password is probably either "Intel123" or "intel123". This was not set by me or my source, this is how it was acquired from Intel."
|
# ? Aug 6, 2020 20:20 |