incoherent
Apr 24, 2004

0101010001101000011100100110100101101100011011000110010101110010
Also Apple, for their iPad CPUs.


Yaoi Gagarin
Feb 20, 2014

ufarn posted:

Can someone quantify "thousands of 300mm wafers" for me?

Probably somewhere between 50 and 100 times that many CPUs. I think a wafer yields about that many chips?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Craptacular! posted:

Why are people promoting NVENC when I’d rather just put spare cores or a spare computer toward encoding than the video card that’s also occupied with a game? I’d rather tie up excess CPU threads. That’s better for those of us who are GPU bottlenecked just playing games without streaming.
The video encoders and decoders on GeForce cards are dedicated hardware blocks.

Arzachel
May 12, 2012

Craptacular! posted:

Why are people promoting NVENC when I’d rather just put spare cores or a spare computer toward encoding than the video card that’s also occupied with a game? I’d rather tie up excess CPU threads. That’s better for those of us who are GPU bottlenecked just playing games without streaming.

The GPU overhead is very low since it's using fixed function hardware and not the shaders. The real issue is that NVENC still looks quite a bit worse than CPU encode at equal bitrate.
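For anyone who wants to see that tradeoff for themselves, here's a minimal sketch of an equal-bitrate comparison using ffmpeg. It assumes an ffmpeg build with NVENC support on PATH; the input file name is a placeholder.

```python
# Encode one clip two ways at the same bitrate so the outputs can be
# compared by eye or with a quality metric. Assumes ffmpeg is on PATH
# and was built with NVENC support; input.mp4 is a placeholder name.
import subprocess

SOURCE = "input.mp4"  # hypothetical source clip
BITRATE = "6M"        # a typical streaming bitrate

# Hardware NVENC encode: runs on the fixed-function block, with
# near-zero cost to the shaders or the CPU.
subprocess.run(
    ["ffmpeg", "-y", "-i", SOURCE, "-c:v", "h264_nvenc",
     "-b:v", BITRATE, "out_nvenc.mp4"],
    check=True,
)

# Software x264 encode: ties up CPU threads, usually better quality
# per bit at streaming bitrates.
subprocess.run(
    ["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libx264", "-preset", "fast",
     "-b:v", BITRATE, "out_x264.mp4"],
    check=True,
)
```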

SwissArmyDruid
Feb 14, 2014

by sebmojo

EmpyreanFlux posted:

It's supposedly limited to 16nm and 12nm wafers, not 7nm, so it fucks AMD on consoles, fucks Nvidia on Turing and low-end Pascal, and fucks Intel on chipsets.

Oh no, low-end Pascal. :rolleyes: =P

I suppose the supply of MX150s will be depressed for a bit, then.

ufarn posted:

Can someone quantify "thousands of 300mm wafers" for me?

300mm is about 12 inches. (11.81" actual)

This image purports to be a 300mm TSMC 16nm wafer. The image shows up almost exclusively in articles talking about Pascal and/or TSMC 16nm, which leads me to believe that those are, in fact, Pascal dies. Each square you see on the wafer is a chip, before it's packaged for placement into its mounting solution.



So take that image, and multiply by a couple thousand.

SwissArmyDruid fucked around with this message at 02:10 on Jan 29, 2019
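To put rough numbers on that, here's the standard dies-per-wafer approximation. The 200 mm² die size is an assumed, illustrative figure (plausible for a midrange 16nm GPU), not anything from the article.

```python
import math

# Back-of-the-envelope dies-per-wafer estimate.
WAFER_DIAMETER_MM = 300.0
DIE_AREA_MM2 = 200.0  # assumed die size, for illustration only

radius = WAFER_DIAMETER_MM / 2
# Gross-area term minus an edge-loss correction (dies near the rim
# are incomplete and get thrown away).
dies = (math.pi * radius**2 / DIE_AREA_MM2
        - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * DIE_AREA_MM2))

print(f"~{dies:.0f} candidate dies per wafer")      # ~306
print(f"3,000 wafers -> ~{3000 * dies:,.0f} dies")  # ~919k
```

Yields are never 100%, so the sellable-chip count would be lower still.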

crazypenguin
Mar 9, 2005
nothing witty here, move along

ufarn posted:

Can someone quantify "thousands of 300mm wafers" for me?

I think that's supposed to be a few days (3?) of output for that particular fab. Just a wild-rear end guess, but maybe this means a week of disruption?

I don't think that will mean any price disruptions or shortages or anything, just "aw poo poo we set a bunch of money on fire" for TSMC.

repiv
Aug 13, 2009

Arzachel posted:

The GPU overhead is very low since it's using fixed function hardware and not the shaders. The real issue is that NVENC still looks quite a bit worse than CPU encode at equal bitrate.

Has anyone done an in-depth test of Turing's encoder yet? Nvidia was claiming better quality than x264's fast preset, which is impressive if true.

Khorne
May 1, 2002

MaxxBot posted:

The first link you posted showing a 2080 Ti at 1080p Ultra is basically the worst-case realistic scenario for Ryzen, and there's still only a single game where the difference is in the 25-50% range you stated. The second link shows a 2080 Ti at 1080p with settings turned down, which is an ultra-niche scenario that would hardly ever actually be run in real life. If you have to use settings that are barely ever used outside of a GPU review to get a 25-50% difference, I would not say that's typical.

I always set everything to low in all my games, and I played at 1080p 144Hz only until recently. Now I have a 1440p 165Hz IPS, but I'm on an i7-3770K with a 1070 and get >160 fps in any game I care about. The only exceptions are weird KMMOs, where getting ~100 fps doesn't matter, and Battlerite, which is insanely unoptimized. With the 1070 I can even turn the graphics settings up in some casual games without impacting my fps, which I couldn't do when I had a 670 (which also got >144 fps in esports titles).

I mostly prefer low graphics, or some mix of low+high, for improved visibility. I only really focus on the gameplay when I play games and visuals are kinda meaningless to me. The game's style isn't meaningless, but all of the silly effects and things that detract from gameplay are.

I also plan on buying Zen 2 if it's close to the 9900K, because more cores are more useful to me.

The reality is, outside of 240Hz gaming, it doesn't matter which you buy when it comes to performance. Especially once zen2 comes out. It's real niche to care about framerates so high in those scenarios, and by the time it becomes commonplace, if it ever does, we'll all be upgrading anyway. Most people have poorly ventilated, low-range GPUs and poorly cooled CPUs on bad VRMs and crank everything to max and aren't really CPU bound.

A 3770K with slow RAM has a ~4600 single-core Geekbench 4 score. It was $300 on release. It beats, or is within a few percent of, many i7-7700 machines released 5 years later for $320, in both single- and multi-core performance. Intel was releasing the same thing over and over with creeping prices before AMD lit a fire under them. The 9-series blows the 3770K away at stock. I felt insane paying $310 for a CPU when the midrange (~$180) was so competitive just 2 years prior, but the performance gap was gigantic. It looks like AMD is ushering in a new era of the competitive midrange, forcing Intel to compete, and I'm happy to pay less and get way more no matter which company I choose.

For big-time enthusiasts who are single-thread bound the 9900K seems legitimate, but for anyone outside the top 0.01% the 2600X or 2700X are way better buys. You can get a 2600X + motherboard + 32GB (2x16GB) of DDR4-3200 C16 RAM for less than a 9900K alone, never mind the cost of the 9900K's cooler, motherboard, and RAM. And AMD's stock cooler is adequate. It's really hard to say "5-15% worse fps, maybe 50% in one AVX-heavy game, which might not even be relevant on your monitor" is worth such a markup. Bundling the monitor and GPU into the argument is a bit dishonest when monitors tend to outlast computers and GPUs have their own tradeoffs independent of other components. Especially with Zen 3 looking to disrupt again in 2020 and Intel hopefully having a new desktop part by late 2020 or sometime in 2021.

Khorne fucked around with this message at 06:02 on Jan 29, 2019
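As a sketch of that value argument in code: every number below is loudly made up — the prices are rough early-2019 US street prices and rel_fps is an illustrative placeholder in line with the post's "5-15% worse" claim, not a benchmark result.

```python
# Illustrative platform-cost comparison; prices are rough early-2019
# street prices (CPU + board + cooler) and rel_fps is a made-up
# placeholder, not measured data.
builds = {
    "R5 2600X (stock cooler)": {"parts": 220 + 120 + 0,  "rel_fps": 0.90},
    "i9-9900K (+ cooler)":     {"parts": 530 + 180 + 80, "rel_fps": 1.00},
}

for name, b in builds.items():
    value = b["rel_fps"] / b["parts"] * 1000
    print(f"{name}: ${b['parts']} platform, {value:.2f} relative fps per $1000")
```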

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

crazypenguin posted:

I think that's supposed to be a few days (3?) of output for that particular fab. Just a wild-rear end guess, but maybe this means a week of disruption?

I don't think that will mean any price disruptions or shortages or anything, just "aw poo poo we set a bunch of money on fire" for TSMC.

A fab will generally aim for 45k wafer starts per month per line. No idea how big this one is in comparison, or how many lines it'll be able to support. It's a big expensive fuckup, especially if they have to take machines offline to clean the hilariously deadly chemicals out of them.

Cygni
Nov 12, 2005

raring to post


like 80% of this post and its opinions/statements are baffling to me

Khorne
May 1, 2002

Cygni posted:

like 80% of this post and its opinions/statements are baffling to me
Yeah, it's very disjointed. I'm pretty tired and read 5+ pages at once and then wrote that.

I was aiming for

  • AMD being competitive has made Intel release far more competitive CPUs in the midrange
  • Intel was real lazy from 2012-2018, then released the 9XXX CPUs, which are nice. Honorable mention to the 5820K and 8700K as well.
  • As long as you have a high end 6+ core CPU from Intel that clocks high or a 2600x/2700x from AMD you're going to get about the same frame rate in almost any game unless you want 240Hz at 1080p. Then it's only Intel.
  • Zen2 will close that 50% gap down to a handful of % in that one title, and in the titles where AMD is within 15% already (most titles) it should also close the gap to the point where price or more cores will decide things.
  • Zen3 is going to be another tick to zen2's tick and is scheduled for 2020 as of now. That's nuts.

And I somehow injected a lot of personal anecdotes (2600k/3770k being competitive with more expensive non-k cpus from 5 years later) and my preference for low graphical settings despite having the hardware to run at far higher than that. I literally play on a mix of low and high in titles from 19 years ago. There's no deciphering that. It's permanently baffling.

Khorne fucked around with this message at 06:33 on Jan 29, 2019

PC LOAD LETTER
May 23, 2005
WTF?!

Methylethylaldehyde posted:

A fab will generally aim for 45k wafer starts per month per line. No idea how big this one is in comparison
The article said 100K+ wafers a month of supply from the fab, and they trashed around 10K wafers with this mistake, so it's quite the oopsie, but it's not going to make a big dent in the market.

PC LOAD LETTER fucked around with this message at 06:35 on Jan 29, 2019
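Turning those quoted figures into time is just arithmetic on the numbers above:

```python
# Scale of the scrap event, using the figures quoted from the article.
MONTHLY_STARTS = 100_000  # wafers/month from the fab (article figure)
SCRAPPED = 10_000         # wafers lost to the bad photoresist

days_lost = SCRAPPED / MONTHLY_STARTS * 30
print(f"~{days_lost:.0f} days of fab output")  # ~3 days
```

Which lines up with crazypenguin's "a few days (3?)" guess.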

Cygni
Nov 12, 2005

raring to post

Khorne posted:

Yeah, it's very disjointed. I'm pretty tired and read 5+ pages at once and then wrote that.

I was aiming for

  • AMD being competitive has made Intel release far more competitive CPUs in the midrange
  • Intel was real lazy from 2012-2018, then released the 9XXX CPUs, which are nice. Honorable mention to the 5820K and 8700K as well.
  • As long as you have a high end 6+ core CPU from Intel that clocks high or a 2600x/2700x from AMD you're going to get about the same frame rate in almost any game unless you want 240Hz at 1080p. Then it's only Intel.
  • Zen2 will close that 50% gap down to a handful of % in that one title, and in the titles where AMD is within 15% already (most titles) it should also close the gap to the point where price or more cores will decide things.
  • Zen3 is going to be another tick to zen2's tick and is scheduled for 2020 as of now. That's nuts.

And I somehow injected a lot of personal anecdotes (2600k/3770k being competitive with more expensive non-k cpus from 5 years later) and my preference for low graphical settings despite having the hardware to run at far higher than that. I literally play on a mix of low and high in titles from 19 years ago. There's no deciphering that. It's permanently baffling.

sall good dude you aint gotta answer to me, im a dipshit. sorry if i came across as a dick, i was really just confused by the low settings thing cause im the total opposite, ha.

what sorta games specifically do you play if you dont mind me askin? competitive shooter types?

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

PC LOAD LETTER posted:

The article said 100K+ wafers a month of supply from the fab, and they trashed around 10K wafers with this mistake, so it's quite the oopsie, but it's not going to make a big dent in the market.

There are going to be a LOT of 8Ds and disciplinaries thrown around in that factory for the next couple of weeks, though.

Working in a similar industry, I really feel for the people there; the atmosphere must be loving horrible in there right now.

It's always the small operators getting it in the neck while the middle managers get away scot-free, despite being responsible for creating the situation where such a monumental gently caress-up was allowed to happen.

I'm not mad or anything!!

Truga
May 4, 2014
Lipstick Apathy

repiv posted:

Has anyone done an in-depth test of Turing's encoder yet? Nvidia was claiming better quality than x264's fast preset, which is impressive if true.

GPU encoders still depend on having a decent bitrate to produce good results. x264 will produce better results when streaming, unless your bandwidth is high enough not to matter or you're saving to disk and have loads of space.

Nvidia has been claiming NVENC beats some x264 preset by citing "delta error from raw video" tables since at least Maxwell, but that doesn't necessarily say anything about the visual quality to a human viewer.
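That "delta error from raw video" metric is essentially PSNR. Here's a minimal NumPy sketch of how it's computed, with random frames standing in for real decoded video:

```python
import numpy as np

def psnr(reference: np.ndarray, encoded: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two 8-bit frames."""
    mse = np.mean((reference.astype(np.float64) - encoded.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak**2 / mse)

# Stand-in frames: random noise instead of real video.
ref = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
noise = np.random.randint(-3, 4, ref.shape)
enc = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)

print(f"PSNR: {psnr(ref, enc):.1f} dB")
# A high PSNR only says the pixels are numerically close on average;
# it can't tell you whether the differences are visually objectionable.
```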

Khorne
May 1, 2002

Cygni posted:

sall good dude you aint gotta answer to me, im a dipshit. sorry if i came across as a dick, i was really just confused by the low settings thing cause im the total opposite, ha.

what sorta games specifically do you play if you dont mind me askin? competitive shooter types?
All kinds of games. If I'm being more honest, there are plenty of settings I turn up. I'm just not a fan of anti-aliasing, bloom, most reflection stuff (especially the one that makes everything look coated in semen), and settings like those. I don't want my whole screen shaking and obscured by particles. Lots of times I'll play with high environment textures and low player-model textures, or the opposite, but in games where it makes no difference I might as well have them all turned up.

In something like Anno I crank them up as much as is reasonable, because it doesn't have any annoying effects and you're not really 'looking' at the screen much unless it's one of those quests to find a guy. In Minecraft you paradoxically get more fps with the graphics setting on Fancy than on Fast.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

Truga posted:

GPU encoders still depend on having a decent bitrate to produce good results. x264 will produce better results when streaming, unless your bandwidth is high enough not to matter or you're saving to disk and have loads of space.

Nvidia has been claiming NVENC beats some x264 preset by citing "delta error from raw video" tables since at least Maxwell, but that doesn't necessarily say anything about the visual quality to a human viewer.
They actually cite subjective image quality now (after citing PSNR lol): https://www.nvidia.com/en-us/geforce/news/geforce-rtx-streaming/

I'm suspicious of whether it's actually that good, but I've watched a few Twitch streams that have been using the new Turing encoder and they looked fine, so it's at least not noticeably worse.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
AMD amends wafer agreement again for 2021

https://www.anandtech.com/show/1391...m_medium=social

Highlights are that AMD now pays no royalties or fees for using another fab, and there is a potential end date for the relationship (2024). They still owe GloFo the difference between planned and purchased wafers for a given fiscal year, but I honestly don't think that's going to be an issue if Zen 2 and Rome are any success, plus other potential products, as 14/12nm are still useful nodes.

SwissArmyDruid
Feb 14, 2014

by sebmojo

EmpyreanFlux posted:

AMD amends wafer agreement again for 2021

https://www.anandtech.com/show/1391...m_medium=social

Highlights are that AMD now pays no royalties or fees for using another fab, and there is a potential end date for the relationship (2024). They still owe GloFo the difference between planned and purchased wafers for a given fiscal year, but I honestly don't think that's going to be an issue if Zen 2 and Rome are any success, plus other potential products, as 14/12nm are still useful nodes.

Oh, is THAT why AMD's stock price jumped 20% yesterday.

edit: Oh, no, that's because they reported earnings and 23% increased revenue YOY.

https://www.tomshardware.com/news/amd-earnings-4q-2018-ryzen-epyc,38523.html

edit edit: More poking into yesterday's earnings. Navi to launch in some form or another this year.

quote:

... as we see the GPU business right now, we see the first quarter as the low point in the business, with the channel improving as we go into the second quarter. And we have additional product launches there as well. So that's the way we would see the portfolio.

quote:

Our gaming growth will be driven by new products. We would see that as we go through this year and with our Radeon 7 launch, as well as our Navi launches on the gaming side.

https://www.overclock3d.net/news/gpu_displays/amd_s_lisa_su_confirms_that_navi_will_have_launches_in_2019/1

SwissArmyDruid fucked around with this message at 22:52 on Jan 31, 2019

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


It went up after they posted earnings. Look at the after-hours graph.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

More poking into yesterday's earnings. Navi to launch in some form or another this year.

https://www.overclock3d.net/news/gpu_displays/amd_s_lisa_su_confirms_that_navi_will_have_launches_in_2019/1

That sounds like Navi is a June launch, so I think AMD's going huge at Computex this year: Ryzen 3000 AM4 and Navi 10/12 launch, probably a Rome launch, and likely a Threadripper 3000 announcement. I think that puts Navi APUs and Navi 14 at CES 2020, as they'd have to hold their own event for that after Computex, and I find that unlikely.

Das_Ubermike
Sep 2, 2011

www.oldmanmurray.com
What’s the general consensus on the cooler that comes attached to the 2700X? Does it need to be replaced or is it adequate for its function if you have no plans to OC the processor?

GRINDCORE MEGGIDO
Feb 28, 1985


It performs quite well, but it's a bit noisy when it ramps up.

Das_Ubermike
Sep 2, 2011

www.oldmanmurray.com

GRINDCORE MEGGIDO posted:

It performs quite well, but it's a bit noisy when it ramps up.

That’s ok, I’m wearing headphones most of the time when I’m gaming so I probably won’t hear it too much. Thanks buddy.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Yeah, it's around the same performance as an olde Hyper 212 IIRC, and a 2700X will boost to spec even if it's running a little hot. Like, I think you need to go to some 140mm tower monstrosity to see a difference worth your money.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
In my experience, any fan that spins at 2000 RPM is pretty noticeable, and the Prism peaks at 2700 RPM.

90s Solo Cup
Feb 22, 2011

To understand the cup
He must become the cup



Never had a chance to test out the Wraith cooler. I went straight to the NH-D15S :getin:

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop
Looking to build a box for virtualization: host+guest GPU, NVMe, 10GbE. Probably means I'm looking at a TR2 2950x.

The rumored 12/16 core Ryzens sound great, but I think I'll be PCIe-lane starved because I'm basically building two machines into one. Hoping with some research and setup I can stick the guest OS on a CCX with its own local RAM and PCIe lanes.

Of course, I really want a zen2 threadripper but those aren't even in the speculation pipeline yet.

E: The "why not just build two machines" reason is when I'm not using the guest I have one stupidly powerful machine instead of a normal box and a doorstop.

GRINDCORE MEGGIDO
Feb 28, 1985


Balliver Shagnasty posted:

Never had a chance to test out the Wraith cooler. I went straight to the NH-D15S :getin:

I ran mine on it for two days, until WC arrived. I became weirdly fond of the bling.

Setset
Apr 14, 2012
Grimey Drawer

Harik posted:

Looking to build a box for virtualization: host+guest GPU, NVMe, 10GbE. Probably means I'm looking at a TR2 2950x.

The rumored 12/16 core Ryzens sound great, but I think I'll be PCIe-lane starved because I'm basically building two machines into one. Hoping with some research and setup I can stick the guest OS on a CCX with its own local RAM and PCIe lanes.

Of course, I really want a zen2 threadripper but those aren't even in the speculation pipeline yet.

E: The "why not just build two machines" reason is when I'm not using the guest I have one stupidly powerful machine instead of a normal box and a doorstop.

The new Zen 2 chipset uses PCIe 4.0 I believe, which has double the bandwidth. You can get away with just x8 lanes for the same throughput - if you even need that much.
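The per-lane math behind that point (both generations use 128b/130b encoding):

```python
# Per-lane throughput for PCIe 3.0 (8 GT/s) and PCIe 4.0 (16 GT/s),
# both with 128b/130b encoding; results in GB/s.
GBPS_PER_LANE = {"3.0": 8 * 128 / 130 / 8, "4.0": 16 * 128 / 130 / 8}

for gen, lanes in [("3.0", 16), ("4.0", 8)]:
    print(f"PCIe {gen} x{lanes}: {GBPS_PER_LANE[gen] * lanes:.2f} GB/s")
# Both come out to ~15.75 GB/s: a Gen4 x8 link matches a Gen3 x16 link.
```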

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Harik posted:

Of course, I really want a zen2 threadripper but those aren't even in the speculation pipeline yet.
They usually come half a year later. If desktop Zen 2 is slated for June as some rumors claim, you can plan for a Christmas present or something.

90s Solo Cup
Feb 22, 2011

To understand the cup
He must become the cup



GRINDCORE MEGGIDO posted:

I ran mine on it for two days, until WC arrived. I became weirdly fond of the bling.

I will say this about the Wraith cooler: it makes the stock Intel cooler that came with my 3570K look downright cheap and wimpy in comparison.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Harik posted:

Looking to build a box for virtualization: host+guest GPU, NVMe, 10GbE. Probably means I'm looking at a TR2 2950x.

The rumored 12/16 core Ryzens sound great, but I think I'll be PCIe-lane starved because I'm basically building two machines into one. Hoping with some research and setup I can stick the guest OS on a CCX with its own local RAM and PCIe lanes.

Of course, I really want a zen2 threadripper but those aren't even in the speculation pipeline yet.

E: The "why not just build two machines" reason is when I'm not using the guest I have one stupidly powerful machine instead of a normal box and a doorstop.

You'd probably be more cost-effective going with an Epyc CPU. Threadrippers are some of their highest-clocking chiplet bins; if you don't need that level of clockspeed you can save a fair bit of cash and still get plenty of cores. There are a number of ATX boards available for the socket. More memory channels, too.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Harik posted:

Looking to build a box for virtualization: host+guest GPU, NVMe, 10GbE. Probably means I'm looking at a TR2 2950x.

I've got this setup running right now, and after the initial hiccups with AMD breaking Linux with the same AGESA update that enabled support for TR2, it's been mostly smooth sailing. I'm assuming you're doing this for gaming in a Windows VM on a Linux host, in which case there are plenty of good resources on Level1Techs or if you google for VFIO. It's really not too difficult now, and the most important thing is buying the right motherboard with good IOMMU groups.

From personal experience I recommend going with the ASRock Taichi board; I went with the Zenith Extreme and it hasn't been stellar. It's recommended that you stay away from MSI boards, as they're not known for good IOMMU groups. For 10GbE the cheapest option is picking up some used Mellanox ConnectX-2s on eBay, though it gets more expensive if you want more than just a direct connection between two machines.
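If you want to check a board's IOMMU grouping before committing to passthrough, the sysfs layout below is standard on Linux. A quick sketch, run on the host with IOMMU enabled in firmware and on the kernel command line (amd_iommu=on or intel_iommu=on):

```python
# List IOMMU groups and the PCI devices in each. A GPU you intend to
# pass through should sit in its own group (along with its HDMI audio
# function), not lumped together with unrelated devices.
from pathlib import Path

groups = Path("/sys/kernel/iommu_groups")
if not groups.is_dir():
    raise SystemExit("No IOMMU groups found; enable IOMMU in BIOS/kernel first")

for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
    devices = sorted(dev.name for dev in (group / "devices").iterdir())
    print(f"group {group.name}: {', '.join(devices)}")
```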

GRINDCORE MEGGIDO
Feb 28, 1985


Balliver Shagnasty posted:

I will say this about the Wraith cooler: it makes the stock Intel cooler that came with my 3570K look downright cheap and wimpy in comparison.

Totally. I'm going to downsize my case soon and might just use it.

Mr Shiny Pants
Nov 12, 2012

Desuwa posted:

I've got this setup running right now, and after the initial hiccups with AMD breaking Linux with the same AGESA update that enabled support for TR2, it's been mostly smooth sailing. I'm assuming you're doing this for gaming in a Windows VM on a Linux host, in which case there are plenty of good resources on Level1Techs or if you google for VFIO. It's really not too difficult now, and the most important thing is buying the right motherboard with good IOMMU groups.

From personal experience I recommend going with the ASRock Taichi board; I went with the Zenith Extreme and it hasn't been stellar. It's recommended that you stay away from MSI boards, as they're not known for good IOMMU groups. For 10GbE the cheapest option is picking up some used Mellanox ConnectX-2s on eBay, though it gets more expensive if you want more than just a direct connection between two machines.

Which firmware are you running? I decided against upgrading because of some problems with AGESA. Are they fixed?

Do you still need the PCIe reset patch for passthrough?

Klyith
Aug 3, 2007

GBS Pledge Week

EmpyreanFlux posted:

Yeah, it's around the same performance as an olde Hyper 212 IIRC

The Wraith Prism absolutely does not match the performance of 120mm tower coolers if you are also measuring noise. Maybe it can keep up with a Hyper 212 in delta-T, but heatsink performance ignoring noise is a bad comparison.

That said, it's a completely usable heatsink! Normal users can use the low fan setting (which is still louder than many 120mm towers) for games and workstation programs and not be thermally downclocked or deafened. If I got a CPU with one of them I'd still replace it, since noise is a priority for me. But I'd totally keep it in reserve for a secondary build or small-case system.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Mr Shiny Pants posted:

Which firmware are you running? I decided against upgrading because of some problems with AGESA. Are they fixed?

Do you still need the PCIe reset patch for passthrough?

I'm running the 1501 firmware for the Zenith Extreme, which isn't the newest one, but everything seems to work. The only specific problem I'm aware of with the newest AGESA (which is still the TR2 version) is that it advertises support for an EPYC-only feature, causing then-new Linux kernels to hang. You could compile a kernel without support for that feature to work around it, but it really hosed me over when I was trying to get root-on-ZFS working, since I had never worked so closely with Linux's boot process before.

The PCIe reset thing was fixed in one of the newer AGESA versions, either the one with TR2 support or the one before it. It took AMD about a year to fix it, though. Ryzen having a much larger install base, and getting more attention with respect to bug fixes, is a potential reason to go with Ryzen over Threadripper.

Except for those two issues, my only other significant problems have been Asus-specific, which is why I recommend the Taichi. The specific issues are not being able to choose which GPU is used for the BIOS, and the board refusing to POST with two graphics cards if one was Turing, for over a month after those cards were released.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Klyith posted:

The Wraith Prism absolutely does not match the performance of 120mm tower coolers if you are also measuring noise. Maybe it can keep up with a Hyper 212 in delta-T, but heatsink performance ignoring noise is a bad comparison.

That said, it's a completely usable heatsink! Normal users can use the low fan setting (which is still louder than many 120mm towers) for games and workstation programs and not be thermally downclocked or deafened. If I got a CPU with one of them I'd still replace it, since noise is a priority for me. But I'd totally keep it in reserve for a secondary build or small-case system.

I don't disagree with this post; noise isn't usually a primary issue for me, and I tend to think of it less as a performance metric than maybe I should.


mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Built my fourth compute node today. Went to Microcenter and picked up a 1600 for $99 to make it happen.

I wanted more cores, but I realized that I'm going to upgrade the CPU in all four machines as soon as the 3x00 series comes out, so I decided to go cheap for now.

I'm really, really hoping for a 65W part with 12 cores, as that would give me 48c/96t across my tiny compute farm.
