teagone
Jun 10, 2003

That was pretty intense, huh?

SourKraut posted:

I'm going to take inventory of my PC "boneyard" starting tonight with the hope of having a list of everything available that I really just want out of the house (well, really, that my wife would like out of the house!). Hopefully you both have a good idea now, but once I have a list and pictures up, I'll probably make a SA Mart thread to put them in and give you both links solely so I can keep track of where stuff is going.

Teagone, you will get first dibs on anything you would like to have for your goddaughter, and then Potato Salad you'll be next up for anything. After that, I'll just open it up to other Goons.

Edit: Alternatively, Mods, I know SA Mart is normally the place for transactions/etc., but for this year given Covid and how bad the year has been, would it be acceptable to make a Holiday Goonmas Thread or something where those of us it seems with stuff that can be shared with others, could simply post what we have and so the goons in SHSC could see?

Thanks again so much for this :unsmith:

We've been discussing stuff over PMs, but I suppose I can just wait to see everything you have readily available once you take inventory. I don't want to take anything that might leave PotatoSalad stranded considering how generous both you and Hed are being. Want to make sure everyone can get the help they need :)

movax
Aug 30, 2008

SourKraut posted:

Edit: Alternatively, Mods, I know SA Mart is normally the place for transactions/etc., but for this year given Covid and how bad the year has been, would it be acceptable to make a Holiday Goonmas Thread or something where those of us it seems with stuff that can be shared with others, could simply post what we have and so the goons in SHSC could see?

I'll go ask / check what the other modmins think — all it takes is one person to completely ruin it for everyone and I know similar things have been tried in the past, but I like the spirit!

Potato Salad
Oct 23, 2014

nobody cares


movax posted:

I'll go ask / check what the other modmins think — all it takes is one person to completely ruin it for everyone and I know similar things have been tried in the past, but I like the spirit!

If someone gets left high and dry, I'm sure the rest of us would shower them with candy and questionable thumbdrives

sudo rm -rf
Aug 2, 2011


$ mv fullcommunism.sh /america
$ cd /america
$ ./fullcommunism.sh


DrDork posted:

Yeah, Intel isn't nearly as "hosed" as a lot of people seem to think. They're still very competitive in gaming, despite what AMD says, and should re-take the undisputed crown early next year.

They still own something like 90% of the datacenter space. AMD is making some inroads, true, but there's an enormous amount of software and support stuff that goes into major contracts, not just a matter of price-per-core. Amazon and Google rolling their own CPUs is certainly a long-term threat, but Intel has a lot of room to cut prices there if they start seeing it actually impacting their bottom line.

I mean, don't get me wrong, they need to figure out their sub-14nm processes, because they're gonna eventually get crushed in a way that actually does impact the bottom line if they don't. But even with the "sky is falling" stuff some people have been noting, the last few quarters have been some of Intel's most profitable ever. So, yeah, they're doin' fine as a business.

not at all to say that this is going to happen to intel, but the “most profitable quarter” ever reminds me of how RIM’s blackberry sales peaked in 2011, years after the introduction of the iphone and android. RIM was still hosed.

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

sudo rm -rf posted:

not at all to say that this is going to happen to intel, but the “most profitable quarter” ever reminds me of how RIM’s blackberry sales peaked in 2011, years after the introduction of the iphone and android. RIM was still hosed.

maybe, but they've likely got a few years left to coast in and figure themselves out.

Apple's never going to sell their chips to anybody else and nobody's really making anything competitive with either the A-Series on ARM, or with Intel/AMD as far as "could power a PC" is concerned, so where's this big Intel usurper supposed to come from on the non-Apple side if we're talking ARM supplanting x86? Is it gonna be Qualcomm? Not likely. Samsung? NVidia? AMD?

It's a weird time for sure.

MeruFM
Jul 27, 2010
I could see nvidia trying, but Apple's dominance here is the result of 10 years of pouring ungodly amounts of money into design while simultaneously shoveling truckloads of money into TSMC

Could Apple's marketshare double in a few years? Possible, but that's still only like 20%.

They could take 90% of the market for high-end computers over $1k and it would still not be 50% of the overall market.

wargames
Mar 16, 2008

official yospos cat censor

MeruFM posted:

They could take 90% of the market for high-end computers over $1k and it would still not be 50% of the overall market.

They won't take 90% of HEDT; the Apple computer market only extends to those who can tolerate macOS, and that's sub-10% of the market, always has been.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Apple also already had a large portion of the $1000+ laptop market. I'm not sure the M1 really changes the game there: if you already were open to a Mac, now you get an even better one. If you want a "gaming laptop" or otherwise need Windows for stuff, it's still not a real option.

In the desktop / laptop space, Intel's real competition right now is obviously AMD. In the server space their competition is AWS and possibly whatever NVidia puts together between their GPUs, Mellanox interconnects, and ARM CPUs in a few years.

Bofast
Feb 21, 2011

Grimey Drawer
Intel is still going to hurt a lot more in the server and laptop markets once the newest architecture actually gets put into Epyc and mobile Ryzen chips, and the more high end server sales they lose out on the less they can subsidize their lower end desktop chips.

If Intel gets their manufacturing nodes working properly so that they can actually make commercially sensible chips on something that isn't just another 14nm tweak, then they should be able to bounce back, but that's still a big question mark. Looking at how bad their 10nm process still is years after it was supposed to be up and running, I hope they can at least get their act together with the next node (7nm?).

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Yeah, at this point it seems pretty clear that 10nm is never going to pan out in terms of actual mass fab capacity, and they're pinning their hopes on being able to move past it to 7nm. Hopefully they actually learned some lessons from failing at 10nm and won't just repeat the same thing on 7nm and we end up talking about how hilariously bad their 13th gen CPUs on 14++++++++++++++++++ are in 2023.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Bofast posted:

Intel is still going to hurt a lot more in the server and laptop markets once the newest architecture actually gets put into Epyc and mobile Ryzen chips, and the more high end server sales they lose out on the less they can subsidize their lower end desktop chips.

There's a maximum limit to how much they can hurt, though; TSMC and Samsung have finite amounts of fabrication capacity, and even if every datacenter customer and laptop OEM decided they wanted AMD, a bunch of them would (may already) have to settle for Intel. And Intel is still selling chips as fast as they can make them; Dell's backlogged on laptop orders right now in part because they can't get enough chips from Intel.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Farmer Crack-Ass posted:

There's a maximum limit to how much they can hurt, though; TSMC and Samsung have finite amounts of fabrication capacity, and even if every datacenter customer and laptop OEM decided they wanted AMD, a bunch of them would (may already) have to settle for Intel. And Intel is still selling chips as fast as they can make them; Dell's backlogged on laptop orders right now in part because they can't get enough chips from Intel.

In the short term, at least... but AMD somehow survived making useless garbage for a decade so I'm sure Intel has some space to unfuck themselves.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

MeruFM posted:

I could see nvidia trying, but Apple's dominance here is the result of 10 years of pouring ungodly amounts of money into design while simultaneously shoveling truckloads of money into TSMC

Could Apple's marketshare double in a few years? Possible, but that's still only like 20%.

They could take 90% of the market for high-end computers over $1k and it would still not be 50% of the overall market.

Apple's product strategy is larger than just laptops. They're trying to position iPads as their alternative to lower-cost PC laptop hardware. This shows up in their marketing - "your next computer isn't a computer," all the performance comparisons that put iPads up against "the best selling PC laptop," and so forth.

More importantly, it's showing up in their technical decision-making, and not just in putting Apple-designed ARM CPUs into Macbooks. They shipped iOS app compatibility in Mac OS, even though it's janky as hell right now, because they want to push developers towards making single apps that work more or less seamlessly across tablets and laptops.

Apple expects that laptops, as a category, are going to go the way of desktops: they'll stick around as a product category, but they'll transition away from "almost every home has one" to "people don't have one unless they specifically need one." They expect to pick up big chunks of the total "consumer-focused computing hardware" marketshare, without necessarily dominating middle-tier laptops.

PC LOAD LETTER
May 23, 2005
WTF?!

mobby_6kl posted:

In the short term, at least... but AMD somehow survived making useless garbage for a decade so I'm sure Intel has some space to unfuck themselves.

AMD made it work by giving up their fabs, which saved the company but also meant they'd eventually be forced into a situation like their current one with TSMC's 7nm supply, no matter what.

Having your own fab is great if you can keep your ASPs up while selling volume. You're pretty much printing money in that scenario!

But if you're forced to make big cuts in your ASPs, even while selling volume, you'll start burning through cash fast, since fabs and the constant R&D needed to stay competitive require heaps of cash to keep running.

Personally, unless their 7nm process turns out to be as big of a shitshow* as their 10nm did, I don't think Intel is going anywhere but I also think they're not going to be the same going forward either.

*there are rumors of serious issues with it but so far nothing at all like with 10nm

edit: nothing is stopping them from doing that. They might not WANT to do it for various business reasons, but nothing would stop them. That is part of the reason why I'm not too worried about Intel disappearing. \/\/\/\/\/\/

PC LOAD LETTER fucked around with this message at 19:59 on Nov 22, 2020

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!
what's realistically stopping Intel from moving aggressively into ARM development and manufacturing of their own/ramping up their current involvement? I'm not saying a complete shift away from x86 but it's clear that they've always struggled to provide compelling low-power stuff going back to Atom. Do they still have StrongARM IP? Is that still valuable? Are they doing anything there?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Farmer Crack-Ass posted:

There's a maximum limit to how much they can hurt, though; TSMC and Samsung have finite amounts of fabrication capacity, and even if every datacenter customer and laptop OEM decided they wanted AMD, a bunch of them would (may already) have to settle for Intel. And Intel is still selling chips as fast as they can make them; Dell's backlogged on laptop orders right now in part because they can't get enough chips from Intel.

TSMC is building a fab in Arizona now. Intel clearly is willing to explore the possibility of using third-party fabs; I wonder if that deal included significant permanent wafer allocations in return for Intel paying for a big chunk of it.

The wafer capacity is only 20k per month so it's not a giant fab, like AMD alone uses something like 16k wafers per month (200k wafer allocation in 2021), but it's a start. I've kinda been wondering if this is aimed at the defense market, like presumably the defense/intelligence communities wouldn't be too hot on a foreign-supplied chip, either from a supply-chain perspective or the potential for stealthy hardware-level trojans. Pinning bits in RDRAND internal state can significantly reduce the entropy of the output, with very little visible evidence of the change, for example. Losing the domestic capacity for leading-edge fabrication has to be concerning from a strategic perspective; right now, if you're not on TSMC you ain't shit.
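The entropy point can be made concrete with a toy model. This is pure illustration — it simulates a generic 4-bit RNG, not Intel's actual DRBG, and a real hardware trojan would hide the bias behind the output conditioning stage — but it shows how much entropy disappears when internal state bits get stuck:

```python
import random
from collections import Counter
from math import log2

def shannon_entropy_bits(samples):
    """Shannon entropy (bits per symbol) of a sample sequence."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def rng_output(pinned_mask, pinned_value, n=20000, width=4):
    """Draw `width`-bit outputs, forcing the bits in pinned_mask to pinned_value."""
    rng = random.Random(1234)
    return [
        (rng.getrandbits(width) & ~pinned_mask) | (pinned_value & pinned_mask)
        for _ in range(n)
    ]

healthy = rng_output(pinned_mask=0b0000, pinned_value=0)  # all 4 bits free
pinned = rng_output(pinned_mask=0b0110, pinned_value=0)   # two bits stuck at 0

print(f"healthy: {shannon_entropy_bits(healthy):.2f} bits/sample")  # ~4.00
print(f"pinned:  {shannon_entropy_bits(pinned):.2f} bits/sample")   # ~2.00
```

Half the state bits stuck halves the usable entropy, yet every individual output still looks like a plausible random number — which is the "very little visible evidence" part.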

Paul MaudDib fucked around with this message at 20:31 on Nov 22, 2020

Cygni
Nov 12, 2005

raring to post

Apple will never be able to compete with ARM's designs
*does so easily*
Ok well Apple will never be able to make a GPU as good as the PowerVR ones they buy
*does so easily*
Ok well Apple will never be able to scale their phone SoCs to be able to compete at x86 voltages
*does so easily*
Ok well Apple will never be able to make a wrapper/abstraction layer that lets x86 code run at acceptable speeds
*does so easily*
Ok well Apple will never be able to scale those dominant laptop ARM parts to compete in even HIGHER voltage desktops <-----we are here

(not saying this is anyone in this thread, im just goofin)

Frankly Apple, Nvidia, and Nintendo are all in on ARM-Land at the moment, and those are the three tech giants that seem to actually have product people involved in their decision making. I'm looking forward to my 256 core Nvidia ARM CPU that draws 500W.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.

Ok Comboomer posted:

what's realistically stopping Intel from moving aggressively into ARM development and manufacturing of their own/ramping up their current involvement? I'm not saying a complete shift away from x86 but it's clear that they've always struggled to provide compelling low-power stuff going back to Atom. Do they still have StrongARM IP? Is that still valuable? Are they doing anything there?

Two things. Firstly, a lot of their value comes from x86 being entrenched in the home PC market and their being in a duopoly with AMD (which was effectively a monopoly for most of the last decade since AMD didn't have competitive products) for the supply of x86 CPUs. (Frankly that's also where a lot of AMD's value comes from too. They both have a common interest in keeping x86 as entrenched as possible.) If they move to ARM they will no longer only be competing against one company for market share but a bunch of them so they likely wouldn't be able to make the same margins as they currently do. Secondly, they would go from owning the architecture and using it for free to having to pay a license fee which would further cut their profit margins. If the Nvidia sale goes through, they would also end up paying those fees to one of their direct competitors.

Since some of the gains that the M1 is showing do seem to be related to the RISC-style uniformity of the instruction set (specifically the huge ROB and amazing instruction-level parallelism) in a way that would be hard to replicate just by throwing more transistors at the x86 decoder (like they were able to do in the past to compete with RISC architectures) I could see Intel (or AMD) releasing a new "Risc86" architecture that preserves (most of) the semantics of x86(-64) (i.e. something close enough you could automatically translate like 95%+ of x86 assembly to it) but with an instruction encoding that's more uniform and easier to decode enabling them to implement those same type of optimizations. Presumably the chips would continue to include an x86(-64) decoder to run legacy software as well, making the transition as smooth as possible.
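The decode-uniformity argument can be sketched in a few lines. The opcodes and lengths below are hypothetical, purely for illustration: with a fixed-width encoding every instruction boundary is known up front, so a wide core can hand all of them to decoders at once, while a variable-length encoding has to find boundaries serially because instruction i+1's start depends on instruction i's decoded length.

```python
def boundaries_variable(stream, length_of):
    """Variable-length (x86-style): walk the stream sequentially; each
    step depends on the previous instruction's decoded length."""
    offsets, pc = [], 0
    while pc < len(stream):
        offsets.append(pc)
        pc += length_of(stream[pc])  # serial dependency on instruction i
    return offsets

def boundaries_fixed(stream, width=4):
    """Fixed-width (ARM-style): every boundary is trivially known up
    front, so all instructions can be decoded in parallel."""
    return list(range(0, len(stream), width))

# Hypothetical ISA: the opcode byte determines total instruction length.
LENGTHS = {0xA1: 1, 0xB2: 2, 0xC3: 3}
code = [0xB2, 0x00, 0xC3, 0x00, 0x00, 0xA1]

print(boundaries_variable(code, LENGTHS.get))  # [0, 2, 5]
print(boundaries_fixed(bytes(12)))             # [0, 4, 8]
```

Real x86 decoders attack this with predecode bits and brute-force length speculation, but the serial dependency is why going very wide costs disproportionately more there than on a fixed-width ISA.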

Mr.Radar fucked around with this message at 20:40 on Nov 22, 2020

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Paul MaudDib posted:

I've kinda been wondering if this is aimed at the defense market, like presumably the defense/intelligence communities wouldn't be too hot on a foreign-supplied chip, either from a supply-chain perspective or the potential for stealthy hardware-level trojans. Pinning bits in RDRAND internal state can significantly reduce the entropy of the output, with very little visible evidence of the change, for example. Losing the domestic capacity for leading-edge fabrication has to be concerning from a strategic perspective; right now, if you're not on TSMC you ain't shit.

This is explicitly part of the reason for the factory. The US government in general has security concerns about foreign production of semiconductors, and wants domestic capacity for high-end chips. DOD has also been considering throwing money at Intel to try to help them get rolling again in the process game.


Mr.Radar posted:

Since some of the gains that the M1 is showing do seem to be related to the RISC-style uniformity of the instruction set (specifically the huge ROB and amazing instruction-level parallelism) in a way that would be hard to replicate just by throwing more transistors at the x86 decoder (like they were able to do in the past to compete with RISC architectures) I could see Intel (or AMD) releasing a new "Risc86" architecture that preserves (most of) the semantics of x86(-64) (i.e. something close enough you could automatically translate like 95%+ of x86 assembly to it) but with an instruction encoding that's more uniform and easier to decode enabling them to implement those same type of optimizations. Presumably the chips would continue to include an x86(-64) decoder to run legacy software as well, making the transition as smooth as possible.

Yeah this is where things will likely head. At some point x86-64 will need replacement, just like x86 did, and for desktop products it will continue to make sense to design an internally new arch that has hardware backwards compatibility while gaining benefit from more modern design knowledge. Unlike in mobile, where you're scrapping for every bit of power and space on relatively small CPUs, on desktop monsters (and to a degree laptop CPUs) it's not a problem to throw some silicon at keeping backwards compatibility. In the end there's going to be some intersection point between where the mobile approach is optimal and where the desktop approach is optimal, and Apple (who are very, very capable of making smart decisions when they want to) have likely made what is the best choice for themselves by arguing that they can at worst be competitive in the laptop space with their unified approach.

K8.0 fucked around with this message at 21:00 on Nov 22, 2020

repiv
Aug 13, 2009

I don't know if x86-64 will ever need replacing in the sense that x86 did; x86 was replaced because we outgrew the 4GB address space, but there aren't any obvious brick-wall limitations like that in x86-64

Maybe x86-64 will get replaced anyway, but without a strong necessity to do so it's less likely to happen

2 exabytes ought to be enough for anyone
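For scale, the arithmetic behind this: 48-bit and 57-bit are x86-64's current 4-level and 5-level paging limits, and 64 bits is the architectural ceiling, against x86's old 4 GB wall.

```python
# Rough numbers behind the "no brick wall" point: x86-64's virtual
# address limits compared with x86's 4 GB ceiling.
def human(n_bytes):
    units = ["B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB"]
    i = 0
    while n_bytes >= 1024 and i < len(units) - 1:
        n_bytes //= 1024
        i += 1
    return f"{n_bytes} {units[i]}"

print(human(2**32))  # x86 flat address space:           4 GiB
print(human(2**48))  # x86-64, 4-level paging:           256 TiB
print(human(2**57))  # x86-64, 5-level paging:           128 PiB
print(human(2**64))  # full 64-bit architectural limit:  16 EiB
```

Even today's implemented 48-bit mode gives 65,536 times the old 32-bit space before 5-level paging or the full 64 bits come into play.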

Mr.PayDay
Jan 2, 2004
life is short - play hard

repiv posted:


2 exabytes ought to be enough for anyone

!remindme 25 years :smuggo:

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Ok Comboomer posted:

what's realistically stopping Intel from moving aggressively into ARM development and manufacturing of their own/ramping up their current involvement? I'm not saying a complete shift away from x86 but it's clear that they've always struggled to provide compelling low-power stuff going back to Atom. Do they still have StrongARM IP? Is that still valuable? Are they doing anything there?

Intel fucked around with StrongARM for a while, renaming it to XScale and going through a few generations. In 2006 they sold the entire business unit to Marvell, just in time to avoid being part of the smartphone gold rush. (NB: they already had XScale products targeted at phones at the time of the sale!)

When they realized what a terrible mistake they'd made, Intel spent a ton of money trying to cram x86 IP, in the form of Atom derivatives, into phones. It did not work out: performance wasn't great, especially when running ARM binaries through the translation layer Intel had to ship.

I'm also remembering something about it being difficult for Intel to get phone OEMs to deal with any of the pain of shipping a non-ARM Android image. Apple can pull off these architecture switches because they own both the hardware and the OS, but in the PC and Android worlds it's a nightmare because there are so many different organizations with different capabilities and directions.

The products were so unappealing that Intel had to subsidize their way into a handful of phone models. Sales were predictably tepid, and eventually Intel gave up on shoveling money into a fiery inferno.

nitsuga
Jan 1, 2007

Potato Salad posted:

I'm putting together parts for a friend who is finally (!!!!!) going back to school in January. I have an old mitx case, an evo 840, a small budget gpu, and a usb wifi dongle gathered already and I'd absolutely be down to pay a fair rate for a psu, cpu, ram, or mobo.

Sent you a PM.

WhyteRyce
Dec 30, 2001

I'm not optimistic about 7nm because the same managerial, organizational, and political issues are probably still there. Even if it's not the 10nm fuck-up, that's not a high bar

A huge part of Intel's struggles with phones is that they knew nothing about the market or customers. Which is odd, because you'd think from experience they'd know the difficulties and hoops involved in just launching a new platform on even things like shit-tier garbage laptops. But I guess you get used to captive customers who beg for help instead of telling you to pound sand. Jumping back into ARM and having a non-shit hardware solution is only half of it.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Cygni posted:

(not saying this is anyone in this thread, im just goofin)

Thing is, it's not "easily." Apple has been able to do that because they've had billions of dollars in cash they could sink into R&D without an immediate pay-off because they were convinced it'd be worth it 3-5 years down the road. Most companies can't afford to do the same thing, and unless Apple decides to share (lol), that part isn't going to change in the near future.

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

DrDork posted:

Thing is, it's not "easily." Apple has been able to do that because they've had billions of dollars in cash they could sink into R&D without an immediate pay-off because they were convinced it'd be worth it 3-5 years down the road. Most companies can't afford to do the same thing, and unless Apple decides to share (lol), that part isn't going to change in the near future.

They’ve also had some wildly lucrative mobile product lines based on the same tech to fund, justify, and drive all of that development.

The A-Series chips of a decade ago are exponentially less capable than the A14 and M1 of today. And yet from the A4 onward that work always had relatively immediate economic payoff in the form of extremely profitable Apple products sporting A-Series chips.

If they’d tried to just develop an ARM-based PC chip line in secret, divorced from anything else (like if they’d just kept making Macs and iPods and shipping phones + tablets with Samsung or Qualcomm CPUs like it’s 2009) it wouldn’t ever have remotely worked.

Jenny Agutter
Mar 18, 2009

Is a cooler master hyper212 evo enough cooler for an i9-10850k? I'm hitting very high temps at load (100C) and I'm not sure if i need to just repaste or if the trusty old hyper212 just isn't enough for contemporary processors

shrike82
Jun 11, 2005
apple approaching their PCs from a mobile first perspective kinda limits where they'll go. they don't seem interested in cloud compute and i'm skeptical they're interested in making a serious play for the HEDT/local server space. they can make a play for any consumer electronics/computer space they care about so it's more a matter of management focus and entering a market which is either strategically important or actually pushes their financials.

the m1's strength is its battery life - otoh, it's a pity all that processing power is limited to their software ecosystem. outside of a limited set of creative software, you can't really apply the power on stuff like gaming.

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

shrike82 posted:

apple approaching their PCs from a mobile first perspective kinda limits where they'll go. they don't seem interested in cloud compute and i'm skeptical they're interested in making a serious play for the HEDT/local server space. they can make a play for any consumer electronics/computer space they care about so it's more a matter of management focus and entering a market which is either strategically important or actually pushes their financials.

the m1's strength is its battery life - otoh, it's a pity all that processing power is limited to their software ecosystem. outside of a limited set of creative software, you can't really apply the power on stuff like gaming.

that’s not what the usage reports seem to be showing, but yeah Macs are only for music producers or whatever. :rolleyes:

relative paucity of macOS games aside, those that have been tried are often posting very impressive results. You just have to be ok playing Battletech or WoW or Fortnite or Shadow of the Tomb Raider or whatever...

WRT cloud apple seem pretty happy running other people’s tech in their own data centers and seem like they’d be extremely happy to get every developer to buy a MacBook Pro to connect to Azure or Amazon services. I suppose at this point they really don’t see the value in even thinking of pursuing that space, for all that some of their hardware could hypothetically excel in it.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Jenny Agutter posted:

Is a cooler master hyper212 evo enough cooler for an i9-10850k? I'm hitting very high temps at load (100C) and I'm not sure if i need to just repaste or if the trusty old hyper212 just isn't enough for contemporary processors

CPUs are always going to downclock before frying themselves when left to their own devices, but if you're overclocking probably not. The 212 was good like a decade ago but isn't top of the line anymore for fighting off such big heat

redeyes
Sep 14, 2002

by Fluffdaddy
Yeah, the answer is fuck no.

Jenny Agutter
Mar 18, 2009

gradenko_2000 posted:

CPUs are always going to downclock before frying themselves when left to their own devices, but if you're overclocking probably not. The 212 was good like a decade ago but isn't top of the line anymore for fighting off such big heat

I’ve been using them for a decade so I didn’t put much thought into it but I guess the world has moved on. What’s the suggestion these days for cooling 10 cores? Would like to stick to air

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Jenny Agutter posted:

I’ve been using them for a decade so I didn’t put much thought into it but I guess the world has moved on. What’s the suggestion these days for cooling 10 cores? Would like to stick to air

a Be Quiet Dark Rock or one of the Noctua ones

and yeah I don't blame you - my old machine from 2013 is still on a 212 and it still runs like a champ cooling a quad-core

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Ifn you got the dosh you can't go wrong with a big ol' Noctua.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Jenny Agutter posted:

Is a cooler master hyper212 evo enough cooler for an i9-10850k? I'm hitting very high temps at load (100C) and I'm not sure if i need to just repaste or if the trusty old hyper212 just isn't enough for contemporary processors

apart from the usual "repaste your shit and try it again" advice, no, not really; the Hyper 212 Evo is ok but it's not super by modern standards.

I'd be looking at a NH-D15S, a Scythe Fuma 2, or at very minimum a Mugen Rev B.

Fats
Oct 14, 2006

What I cannot create, I do not understand
Fun Shoe

Some Goon posted:

Ifn you got the dosh you can't go wrong with a big ol' Noctua.

Yep, just get a D15 if it'll fit in your case. Nice thing about the Noctuas is that they usually have adapters for future sockets, too, so you can pull it out 5 years later and put it in your next machine. And their fans are the best color.

~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD
The Hyper 212 was always a budget small/medium tower cooler. 4 heatpipes, fairly narrow and I presume you're using it with a single fan.

Any big tower cooler with 6 heatpipes and more fin surface area and two fans will be quite a bit better. Make sure it can clear your RAM and fits in your case besides.

Laslow
Jul 18, 2007

Ok Comboomer posted:

WRT cloud apple seem pretty happy running other people’s tech in their own data centers and seem like they’d be extremely happy to get every developer to buy a MacBook Pro to connect to Azure or Amazon services. I suppose at this point they really don’t see the value in even thinking of pursuing that space, for all that some of their hardware could hypothetically excel in it.
Yep. Cloud is a race to the bottom and they don’t play that game.

Jenny Agutter
Mar 18, 2009

i actually replaced the single fan with a noctua 120mm fan which led to me checking the temps and realizing the issue. if i get the D15 with a single 140mm fan would it make sense to put the 120mm i already have on it as well, or will they end up causing issues? or should I just trial and error it? this is all in a meshify c

GRINDCORE MEGGIDO
Feb 28, 1985


Try it with its included fan.
