Arzachel
May 12, 2012

Wirth1000 posted:

Death to cryptominers.

Arzachel
May 12, 2012

craig588 posted:

TSMC has good processes, but it can't be overstated how far ahead Intel is.

Was.

Arzachel
May 12, 2012

necrobobsledder posted:

So I guess I’ll hobble forward with my E3-1230 Sandy Bridge to pair with my 1080 Ti given the ludicrous prices. I had bought RAM and everything too when I had gotten a good deal on 2 sticks of 16GB DDR4 3200. Google will be getting more of my money than Intel in 2018 then. Bummer, have to figure out what to do with this unopened RAM then. Hell, maybe Samsung and Hynix will keep raising prices and I should hold? And why does this remind me more of day trading than building computers anymore?

Intel seems to be trying to pull an Apple by testing the price sensitivity of its customer base. Thing is, the release event seemed aimed more at trying to appeal to rich and completely clueless gamers rather than content creation professionals or engineers. So I have no clue what is going on over at team Blue.

I'm sure Intel isn't exactly thrilled about this either. Their 10nm process is hosed, which is causing massive shortages of 14nm capacity.

Arzachel
May 12, 2012

quote:

For several years now SemiAccurate has been saying the 10nm process as proposed by Intel would never be financially viable

Pad everything with weasel words and you're never wrong!

Arzachel
May 12, 2012
Temperature also increases leakage, so you end up having to dissipate more power.
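A toy sketch of that feedback loop (made-up constants, not real silicon data): subthreshold leakage grows roughly exponentially with temperature, so extra heat means extra leakage, which means more power to dissipate:

```python
# Illustrative only: toy model of the leakage/temperature feedback loop.
# All constants here are invented for illustration, not real silicon data.

def leakage_power(temp_c, base_leak_w=10.0, ref_temp_c=50.0, doubling_c=30.0):
    """Leakage roughly doubles every `doubling_c` degrees (a common rule of thumb)."""
    return base_leak_w * 2 ** ((temp_c - ref_temp_c) / doubling_c)

def settle(dynamic_w, ambient_c=25.0, c_per_watt=0.3, iters=50):
    """Iterate the feedback: more power -> higher temp -> more leakage -> more power."""
    temp = ambient_c
    for _ in range(iters):
        total = dynamic_w + leakage_power(temp)
        temp = ambient_c + c_per_watt * total
    return temp, total

temp, total = settle(dynamic_w=100.0)
print(f"settled around {temp:.1f} C, {total:.1f} W total")
```

With these toy numbers the loop converges, but the total ends up noticeably above the 100 W of dynamic power alone; with a worse cooler (higher `c_per_watt`) it can run away entirely.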

Arzachel
May 12, 2012

Stickman posted:

I certainly wouldn’t. You’re going to spend more on a motherboard unless you limit yourself to locked boards (and thus only terrible upgrade options), multi core performance is much worse which may matter for some games down the road, and AM4 will have drop-in support for Zen 2 and probably 3 when your processor starts feeling anemic.

E: 6/6 will start having minimum frametime issues before 6/12, too.

The 2600x also comes bundled with actually usable cooling which is a pretty big deal when the budget's tight.

Arzachel
May 12, 2012
The mid-cycle refresh consoles drew well over 100w and with the rumoured $500 price, I'm pretty sure it'll stay that way.

Arzachel
May 12, 2012

eames posted:

8700K was a good buy on launch day and is still holding up very well but I suspect its viability will drop off a cliff once game engines start to hammer 8 threads simultaneously and/or more security vulnerabilities force even consumers to disable HT. I'm hoping that DDR5 platforms are out by then, warranting a complete upgrade.
Right now I feel like gamers shouldn't buy a new Intel CPU + Motherboard over Zen 2 until the price cuts and/or some super high FPS edge cases.

Remember that Intel decided to put off implementing the latest batch of mitigations until after Zen2 launch reviews.

Arzachel
May 12, 2012

DrDork posted:

I'd ask why you can also buy a 1600x900 17" Precision 7740 workstation laptop from Dell, but the answer is inevitably "because it saves $34 and accounting said buy the cheapest option."

LPDDR4X at that speed is real expensive. The next model up is 400 bucks more with more memory being pretty much the only change in specs.

Arzachel
May 12, 2012

redeyes posted:

I got that i3 9100F processor for $90. Works fine mostly, It doesn't seem to turbo to 4.2Ghz. Most I've seen is around 4.03Ghz. Odd... I wonder if there is something I don't know about these new processors.

The stock heatsink is complete garbage, if you're using that. Also, getting a 4/4 CPU in tyool 2019 is a brave decision.

Arzachel
May 12, 2012

Cygni posted:

Talking about competitive balance long term, if Intels 7nm actually shows up on schedule (lol) in 2021 or 2022, it is very likely to wreck everyones poo poo. The density numbers on it are ridiculous. More than double the density of Intels 10nm, TSMCs 7nm+, or Samsungs "7nm" EUV (which are roughly equal), and 50% more dense than TSMC "5nm".

But TSMCs "5nm" is supposed to be already a year old by then, so AMD really does need to keep their foot on the gas while they can. They do have the problem of being a relatively low profit part on TSMCs nodes though, which means they normally don't get a crack until after Apple. Ditching Glofo is a double-edged sword I guess. They wouldn't be here without TSMCs 7nm, but now they have to rely on someone else to supply them... someone who does business with bigger fish than AMD.

Every new process node is the best poo poo ever until it has to go into production :v:

Arzachel
May 12, 2012

Paul MaudDib posted:

You also have to do the math on whether it makes sense to just buy the faster CPU in the first place.

If you paid $220 for a 1600 at launch, you upgrade 2.5 years later to a 3600 for another $200 and sell your 1600 for $50 after ebay fees/shipping (they go for $80 new so that's being generous and assuming you can get near-new valuation). You've now spent $370 net, on something that isn't even as fast as an 8700K.

AMD has convinced people that paying them money every year or two is a virtue somehow. If you're going to a much more high-end chip than you originally bought then sure, it makes sense, but if you are just buying a low/mid chip every 2 years just to upgrade then you probably were better off sucking it up and paying Intel their blood money.

You also kinda have to consider the realities of running a 5+ year old board. The feature set gets dated on older boards. 5 years ago was Haswell, you couldn't necessarily boot off NVMe, you had maybe 2 USB 3.0 (gen 1) ports that ran off a non-bootable addon chipset, certainly no M.2 ports, were running DDR3, etc. You have a lot of wear and tear on the VRMs and caps, do you really want to strain it by tossing a higher-end processor on it. What's the expected service life, are you gonna run this thing to a full 10 years? That kind of thing. The idea that a cheap consumer board is going to happily and reliably go to 10 years of regular operation is... optimistic. Some make it, some don't.

There is no guarantee that the next gen will work on previous boards either. If you bought a 300-series board this could be the last stop for you. It already is for some of the AM4 boards, and for all TR4 users. You are betting on socket compatibility and the guarantees AMD is making aren't as strong as people think they are (again, new chips don't work on all old boards and old chips don't work on new boards, that is already the status quo). There is no guarantee you can put a Zen3 on your B350 board or whatever.

Eh, people are still running overclocked Sandy Bridge cpus with no noticeable wear on the motherboards and there's been zero features since the release of AM4 that I care about. Totally agree that incremental mid-range updates aren't a great value proposition compared to gpus (and even those are getting sketchy) but it can make sense if you don't have the money to go for a $500 cpu up front.
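The quoted upgrade math, written out (prices are the ones given in that post, with resale value after fees/shipping):

```python
# Net cost of the incremental-upgrade path from the quoted post:
# buy a 1600 at launch, later buy a 3600 and resell the 1600.

def net_upgrade_cost(first_cpu, second_cpu, resale_first):
    """Total out-of-pocket cost across both purchases, net of resale."""
    return first_cpu + second_cpu - resale_first

total = net_upgrade_cost(first_cpu=220, second_cpu=200, resale_first=50)
print(total)  # 370
```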

Arzachel
May 12, 2012

Statutory Ape posted:

Hi I am excited for what tiger lake will do for Ultrabooks and probably tablets

Where can I go to learn more about whatever info there is on these chips

Uhhh, Computex probably? Ice Lake only launched a couple of months ago, so what little they teased at CES is all we're getting for a while.

Arzachel
May 12, 2012

DrDork posted:

Yeah, the segment filled by the NVidia 1650 and below is surprisingly large, and anything that can get even to 1050 levels would be able to capably play a lot of net cafe and other lower end games that, frankly, a whole fuckton of people worldwide dump a lot of play time into.

Die size on the smaller chips tends to be pad limited so a new product to fill the niche between iGPUs and a 1650 just isn't financially viable when discounted older products and the used market exists.

Arzachel
May 12, 2012

SwissArmyDruid posted:

It absolutely shouldn't. For comparison's sake, a 750ti was capable of 1.4 TFLOPs single precision.

Right now, at THIS VERY MOMENT, Ice Lake G7 clocks in somewhere between 1.0 and 1.1 TFLOPs, depending on cooling and configuration.

I don't know why you guys think this is some kind of unattainable goal that needs voodoo and HBM and chiplets. It's there! It's right loving there! It's so close!

Memory bandwidth.

Arzachel
May 12, 2012
Raven Ridge officially supported up to DDR4-2933, which is about 23 GB/s; LPDDR4X-4266 does about 34 GB/s, or 50% more, making it real obvious where the 59% performance increase comes from.

A 750 Ti does 86 GB/s.
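The arithmetic here is just transfer rate times bus width. A sketch, assuming a 64-bit (8-byte) channel throughout, which glosses over LPDDR4X's narrower physical channels but matches the totals for a full 64-bit bus:

```python
# Peak DRAM bandwidth back-of-envelope: MT/s * bytes per transfer * channels.
# Assumes 8-byte (64-bit) channels; LPDDR4X physically uses narrower channels,
# but the aggregate for an equivalent 64-bit bus works out the same.

def peak_gbs(mt_per_s, channels=1, bus_bytes=8):
    """Peak bandwidth in GB/s for a given transfer rate and channel count."""
    return mt_per_s * bus_bytes * channels / 1000

print(peak_gbs(2933))                   # DDR4-2933, one channel: ~23.5 GB/s
print(peak_gbs(4266))                   # LPDDR4X-4266 equivalent: ~34.1 GB/s
print(peak_gbs(4266) / peak_gbs(2933))  # ~1.45x, i.e. roughly 50% more
print(peak_gbs(2933, channels=2))       # dual channel: ~46.9 GB/s, vs 86 GB/s on a 750 Ti
```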

Arzachel
May 12, 2012

SwissArmyDruid posted:

And yet one is benched as being capable of about 10% fewer FLOPS than the other. Gee. It's almost like IPC and transistor count between a 28nm process and a 10nm process actually *means* something.

I messed up and the numbers should be doubled for dual channel, so it's not nearly as grim as I thought and Renoir should probably beat a 750ti as long as you're using real fancy memory :toot:

Realistically, AMD's APU designs will still be heavily memory-bandwidth limited until DDR5.

Arzachel
May 12, 2012

ConanTheLibrarian posted:

I was surprised by this and looked up some performance comparisons. It's crazy how close A13 is to Intel and AMD's desktop CPUs. Does anyone have any insights regarding how they've squeezed that much performance out of ARM cores (especially considering its clocks are way lower than desktop CPUs)? E.g. a more efficient ISA?

Infinite money and being able to optimize around a very narrow TDP range and core count. I don't think the ISA matters much, especially since Apple are cracking ARM instructions into micro ops.

Arzachel
May 12, 2012

Rinkles posted:

I am disappointed in the current* state of Intel integrated graphics. I wasn't expecting miracles with this laptop (Kaby Lake i5 , UHD 620), but I thought some older stuff would play decently. Diablo 3 doesn't hit a consistent 30fps even at 720p. And I hover around 20 with Titan Quest AE at minimum details 1080p. Were my expectations unreasonable?

(*this is a 1.5 year old laptop)

e:it has 8gigs of memory

I bet that's a single stick so it's running in single channel mode.

Arzachel
May 12, 2012

Cavauro posted:

It still seems true in a conversation strictly about fps

Games are very sensitive to memory latency once you're pushing high frame rates. Zen3 will probably bring another increase in IF speed, but it's at a fundamental disadvantage compared to Intel's integrated memory controller. But if you can fit the whole hot path into L3, that disadvantage goes away. We already saw glimpses of that with CSGO on Zen2, and Zen3 is doubling the amount of L3 available to a single core.

Arzachel
May 12, 2012

gradenko_2000 posted:

anandtech did a recent retrospective on Broadwell and I think they found it still punches way above its weight due to the cache

I just mentioned this in the AMD thread but Anandtech uses maximum supported memory frequency and JEDEC timings which is probably going to favor Broadwell.

Arzachel
May 12, 2012

Twerk from Home posted:

That's a better deal than any Zen 3 at MSRP, just have a big fan.

If you get a similar discount on the board, maybe.

Arzachel
May 12, 2012

Zeta Acosta posted:

What's the Intel equivalent of the 3600x?

There is none because Intel limits memory speeds on their mainstream chipsets.

Arzachel
May 12, 2012

VorpalFish posted:

IIRC at least for gaming comet lake with 2933 is faster than zen2 with 3600.

Edited: accidentally wrote rocket lake instead of comet lake

The overclockable/higher turbo models maybe, but the 10400 is locked at 2666 MHz memory so a 3600 with a half decent memory kit ends up sweeping both gaming and all core benches. Really hoping Intel doesn't kneecap their mid-range chipsets again with Rocket Lake.

Arzachel
May 12, 2012

Not Wolverine posted:

So, the market for the Core i7-11700K is users who want 8 cores and 16 threads, have a budget for a top of the line CPU, need maximum single core performance for MAX FRAMEZZZ in single threaded games from 2-3 years ago, and have no need for multi-core performance. The only meaningful use I can think of for this turd is to be able to run your 5950 stock checking bot faster than everybody else, assuming that even is a single threaded task.

There's probably more people who could use the single thread performance than >8 cores tbf and the naming scheme makes me think it's going to be priced accordingly. AMD has some real obvious gaps in the 5xxx series lineup (until the inevitable 5600 non-X/5700X) so it can be good value as long as the price is good and Intel doesn't cripple the memory on midrange chipsets again.

Arzachel
May 12, 2012

Not Wolverine posted:

Let me Google that for you.

Yes, it's true some games will still achieve higher frame rates with single threads, yet more and more games today are making use of multiple cores. The Xbox 360 had a 3 core PPC CPU, the PS3 had a 7 core turd Cell CPU, the PS4 and Xbone had 8 core Ryzen CPUs. If it's impossible to make a multi-threaded game engine, why are Sony and Microsoft increasing the core count? More importantly developers are becoming lazy and porting over PS4/Xbone releases to PC which means the higher core count of a console can (in theory. . ) lead to multi threaded PC ports.

Never mind 8c/16t, how many games see significant performance scaling past 4c/8t?

Arzachel
May 12, 2012

SwissArmyDruid posted:

Wait, are B-series boards seriously still locked to 2933?!

And 2666 on the 10600 and below!

Shrimp or Shrimps posted:

I mean, the 10700K already matches or slightly beats the 5800x in averaged gaming performance on most review sites/tubes? It's hardly outlandish to think Intel might retake The Gaming Crown with the 11th gen. What exactly is the argument here, that you need 12 cores for gaming? Your per-thread performance is going to limit you far before your core count does, anyway.

You'll have to link some benches, Comet Lake struggles to keep up in everything except RDR2 as far as I've seen.

Arzachel
May 12, 2012

Wiggly Wayne DDS posted:

they trade blows, where are you seeing zen3 overtaking every title bar RDR2?

Gamers Nexus, specifically their tuned 5600x vs 10600k comparison.

Shrimp or Shrimps posted:

TPU has it a little faster: https://www.techpowerup.com/review/amd-ryzen-7-5800x/16.html Off the top of my head HBU had it only a little slower than the 5800x, like 5% or something. So I should have been more clear in my previous post, either slightly beating or slighting losing to.

Either way, the original point that wolverine guy made that the 11700k would be a poo poo purchase for gaming and he inferred that a 5900x would be better because "lol only 8c/16t can only run 3 year old games" when by the time an 11700k can't run a game, the 5900x won't be doing much better.

Yeah, the margins are small but that's expected when low double digit performance increases are considered a big deal nowadays.

Fully agreed that getting >8c/16t CPU purely for gaming is silly though.

Arzachel
May 12, 2012

gradenko_2000 posted:

Gonna laff if Tiger Lake beats Ryzen mobile because AMD refuses to put Vega out of its goddamn misery

If reusing the iGPU let them pull in Cezanne by a quarter or two, I feel that's completely worth it.

Arzachel
May 12, 2012

BurritoJustice posted:

In practise it seems like a crude way of making the 11900K seem much better compared to the 11700K out of the box, given that if you bench them with stock settings there will be more of a measurable difference than there ought to be

I still maintain it was a bit of a whiff by Anandtech to not mention or test this in their review, their numbers basically aren't relevant to the actual end user. People have been happily running them at 1:1 at 3600+ with no additional effort

Anandtech uses JEDEC memory settings for all their testing. I agree that makes low-res gaming results not very relevant for the people who would care about them, but they're up front about it.

Arzachel
May 12, 2012

Pablo Bluth posted:

Have Intel said anything about moving to a chiplet concept? As it stands, chiplets just seems such a more nimble concept for ramping up the core count.

Yes, for GPUs :v:

Arzachel
May 12, 2012

ConanTheLibrarian posted:

Hasn't Intel already talked about moving to multi-chip modules? If they could combine 10nm cores with 14nm IO chips, it could compensate for their 10nm yield issues.

Lakefield is/was a 10nm compute die (1+4 big/little CPU cores with an iGPU) stacked on top of a 22nm IO die. I've seen zero news or information about it since it was announced last summer, which isn't a great look.

Intel was significantly ahead on packaging tech up until very recently, it just hasn't materialized into a good product for whatever reason.

Arzachel
May 12, 2012

SourKraut posted:

I just don't get the amount of vitriol and anger it causes people; if someone doesn't like it, that's fine, just don't buy it or turn it off if the hardware does have it. Doesn't seem worth the venting/stress/etc.

Imagine coming into the thread to post "Just turn it off lol" after a page describing why "Just turn it off lol" often isn't a great solution.

Arzachel
May 12, 2012

DrDork posted:

There are plenty of devices that can function without needing problematic software running, as well. Install software, set everything up how you like it, then delete software is a valid option in a lot of cases.

Motherboards might have that option (and generally have the least poo poo RGB implementations), but I don't think I've ever seen a GPU or memory module save its LED settings.

Arzachel
May 12, 2012

Indiana_Krom posted:

What helped TSMC succeed with 7nm where Intel failed is TSMC is using EUV "extreme ultraviolet" lithography for critical layers where Intel is trying to do it only with DUV "deep ultraviolet" lithography. Basically if the ultraviolet you use is a marker, Intel is trying to use a 193nm wide marker tip to draw a 10nm wide line where TSMC is using a 13.5nm wide marker instead. Everyone was using 193nm down to about 12-14nm, there are a lot of tricks and workarounds to make it work down to those sizes, but the difficulty goes up exponentially as the size decreases and TSMC/samsung/etc simply waited till the machines that work with 13.5nm became available before they attempted anything smaller.

As far as I know, TSMC's 7nm (N7) and its high-performance variant (N7P) are DUV-only.
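The "marker tip" analogy in the quote maps onto the Rayleigh resolution criterion: minimum printable feature ≈ k1 × wavelength / NA. A hedged sketch with typical textbook k1 and NA values, not any fab's actual numbers:

```python
# Rayleigh criterion sketch: smallest feature a single exposure can resolve.
# k1 and NA values below are typical textbook figures, not real fab parameters.

def min_feature_nm(wavelength_nm, na, k1=0.28):
    """Approximate minimum printable half-pitch for one exposure."""
    return k1 * wavelength_nm / na

duv = min_feature_nm(193, na=1.35)   # immersion DUV: ~40 nm per exposure
euv = min_feature_nm(13.5, na=0.33)  # EUV: ~11 nm per exposure
print(duv, euv)
```

That ~40 nm single-exposure floor is why DUV needs multi-patterning tricks for anything smaller, and why the difficulty climbs so steeply as features shrink.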

Arzachel
May 12, 2012

Wild EEPROM posted:

yeah now they just use propriatary connectors too and the power supply only supplies 12v with the motherboard providing the other voltages

granted the power supplies are high quality, but

I'd love to get a 12V-only board and PSU for SFF builds, but it seems to be limited to OEM use.

Arzachel
May 12, 2012
The 11400F is the best pick right now, close enough to the 5600X and priced a good deal cheaper. The biggest improvement over the 10400F is that you can finally get rated memory speeds even on B560 boards.

Arzachel
May 12, 2012

SwissArmyDruid posted:

Does this mean we're threatening to merge the AMD and Intel threads but not actually do it again?

I honestly wouldn't mind separate AMD/Intel/Nvidia tech support and buying advice threads and a merged PC hardware news thread

Arzachel
May 12, 2012

BurritoJustice posted:

You see enough people taking the numbers literally and thinking TSMC 7nm is a generation ahead of Intel 10nm ESF when it's mostly on par.

So you're saying the naming was correct and they shouldn't have changed it? :v:


Arzachel
May 12, 2012

Cygni posted:

The change doesn't really benefit the enthusiasts, so it is a bit of a net negative for DIY gaming/workstation folks specifically.

The efficiency gains are negligible to non existent for a high power gaming computer (it is really just designed for bulk OEM to meet bulk standards), it is going to increase motherboard costs and complexity likely without any drop in power supply costs, it may make the entire compatibility situation much more complicated for those that still have 3.3v/5v devices ("oh no, this cheap motherboard only has 1 sata power plug!"), you are leaving the 3.3v/5v regulation in the hands of 5-6 companies that basically all have massively hosed up voltage circuitry selection at one point or other on their current products, and it is sorta halfassed for enthusiasts in that it didn't actually consolidate the number or ease of use of the cables.

In the end, I don't actually think it will be that bad because I think the DIY space is just going to ignore the standard for years to come. But it will likely be annoying for all of us who do computer janitor duties for friends and family.

Smaller PSUs and less annoying cabling sounds great for anyone building an itx system tbf.
