|
GamersNexus posted their R7 2700 benchmarks.
|
# ¿ Apr 19, 2018 14:31 |
|
Bottom end motherboards really do have underdimensioned VRMs, though, and they really do limit performance. Boards like the Asus Prime Z370-P come with a VRM that tops out at a sustained power draw of about 120-130 watts, and just enabling all-core turbo on an i7-8700K at stock clocks can easily draw 150W or more in stress tests, unless the chip is a top bin overclocker to begin with. I guess enabling all-core turbo is technically overclocking, but most people wouldn't see it that way. To actually run an "average" clocking 8700K with all cores at max stock turbo for an extended period of time, you need a board that's pretty solidly into the mid-range of the market, and if you don't have airflow over the VRMs and plan to do extensive compute work like video encoding or Blender CPU rendering, you might want even more than that, especially if you plan on keeping the system for five years or more like many people do these days. Buildzoid is also very clear on when he thinks a VRM is way overkill for air or even liquid cooling, by the way. You really don't need to buy top end boards for a 24/7 home overclock, and I don't think he says that either.
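Back-of-envelope, the VRM itself turns a chunk of that draw into heat. A minimal sketch - the 90% efficiency figure is my assumption, not from any datasheet:

```python
# VRM heat estimate: a converter at ~90% efficiency delivering 150 W to
# the CPU burns off the difference as heat in the MOSFETs and inductors.
# The efficiency figure is an assumed ballpark, not a datasheet value.

def vrm_loss_watts(cpu_power_w: float, efficiency: float = 0.90) -> float:
    """Heat dissipated in the VRM while delivering cpu_power_w to the CPU."""
    # input power = output / efficiency; loss = input - output
    return cpu_power_w * (1.0 / efficiency - 1.0)

print(round(vrm_loss_watts(150), 1))  # ~16.7 W that the VRM heatsink has to shed
```

That's a constant space heater sitting under those little heatsinks, which is why airflow over the socket area matters.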
|
# ¿ May 20, 2018 23:45 |
|
The Asrock X470 Master SLI's VRM is, uh, pretty wonky to say the least: https://www.youtube.com/watch?v=mjoa2mU4uMw It doesn't have working overtemp protection on the VRM, and when the temp sensor reaches 125°C the reading rolls over to zero. This is on a testbench with no airflow over the VRM at all, so not really a scenario most people will run into, but still, that's pretty bad! TheFluff fucked around with this message at 18:16 on Jul 26, 2018 |
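No idea what's actually going on in their firmware, but a reading that wraps to zero usually smells like an overflow somewhere. A toy sketch of the kind of bug I mean (the register width is a guess - a 7-bit field wraps at 127, not exactly 125, so take it as illustration only):

```python
# Hypothetical: if the monitoring code stores temperature in a field
# that's too narrow (say 7 bits, max 127), the readout wraps to near
# zero once the real temperature passes the limit.
# This is speculation about the failure mode, not ASRock's actual code.

def read_temp_7bit(actual_temp_c: int) -> int:
    return actual_temp_c % 128  # 7-bit wraparound

assert read_temp_7bit(100) == 100  # sane below the limit
assert read_temp_7bit(130) == 2    # "rolls over" to near zero above it
```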
# ¿ Jul 26, 2018 18:13 |
|
It's depressing to realize that if you want a good motherboard you pretty much gotta give up and give Asus like 300 loving bucks. I mean, some of Asrock's high end stuff is also good, at least as far as power delivery goes, but the BIOSes really do seem worse across the board. Asus' midrange and low end stuff is pretty lovely and overpriced to boot, too.
|
# ¿ Jul 26, 2018 20:20 |
|
GRINDCORE MEGGIDO posted:It's concerning me tbh. AMD seems to be executing well - but if they don't get good bios support, it's going to piss people off. Buildzoid seemed to quite like the Asus Crosshair VII at the very least. He should, though, given that that's the highest end X470 board you can get.
|
# ¿ Jul 26, 2018 21:01 |
|
Craptacular! posted:So next year’s Threadripper will be packaged in old first generation Xbox cases. xbox huge joke? did I take a wrong turn and end up in 2002? is george w in the white house? where's my athlon xp at?
|
# ¿ Aug 2, 2018 23:43 |
|
CAPS LOCK BROKEN posted:Microcenter is offering the 1800X for $199. My i5-3570k is starting to feel a little dated and it would be nice to get back on the AMD train. Any reason why I should get the i5 8600k that's just 20 bucks more? If single thread performance is vastly more important to you than anything else, then the 8600K might be the right choice, but other than that, no. 8 cores/16 threads for $199 is a goddamned steal. Go for it, IMO.
|
# ¿ Aug 6, 2018 02:04 |
|
Also, if you're running a closed loop watercooling solution you won't get much if any airflow over the VRMs naturally. With air cooling (a standard tower cooler) you typically get a lot of airflow around the CPU socket area without having to do anything special. Some boards don't have overtemp protection on the VRMs either (see: Buildzoid's video on the X470 Master SLI - no LN2 involved there, just an AIO, quite modest voltages and power consumption, and the VRM exceeded 125°C within minutes). If you're buying high end CPUs with the intent of pushing them even a little bit, you really, really should buy high end motherboards as well. Or at least top-of-midrange or whatever. Even if it works and doesn't shut down under load, high temps age things like capacitors rather quickly, and the board might die in a year or two instead of lasting basically forever.
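To put numbers on the aging bit, there's the usual rule of thumb for electrolytic caps: lifetime roughly doubles for every 10°C you shave off the operating temperature. The rating numbers below are invented but typical:

```python
# Arrhenius-style rule of thumb: electrolytic capacitor lifetime roughly
# doubles for every 10 degrees C below the rated temperature.
# Rated hours/temp here are made-up but plausible datasheet values.

def cap_life_hours(rated_hours: float, rated_temp_c: float, actual_temp_c: float) -> float:
    return rated_hours * 2.0 ** ((rated_temp_c - actual_temp_c) / 10.0)

# A 5000 h @ 105 degrees C cap, running at 65 vs 95 degrees C:
print(cap_life_hours(5000, 105, 65))  # 80000.0 h, close to a decade of 24/7
print(cap_life_hours(5000, 105, 95))  # 10000.0 h, barely over a year
```

Same board, same caps - 30°C of VRM temperature is the difference between "lasts forever" and "dies out of warranty".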
|
# ¿ Aug 11, 2018 15:37 |
|
Craptacular! posted:I bought a 120mm AIO to run at a machine at stock on quiet mode for five years. I know I’m overbuying on coolers. Buying a 110i/115i PRO for over $100 and putting it on a cheap motherboard looks silly from a performance standpoint, but performance isn’t all that matters so long as nothing overheats and dies. So that’s why I asked about the Gaming Pro Carbon AC, which is like $20 more than the Titanium but means stepping up to full ATX. (Newegg just debuted the Titanium this morning for $110, while the Carbon is $130.)

For most intents and purposes, a doubled 4-phase is pretty much as good as an actual 8-phase. The point of having a multi-phase VRM is to let many different components share the current load - in an 8-phase VRM, each phase (essentially consisting of a pair of MOSFETs, an inductor and one or more capacitors) is only turned on 1/8th of the time. These things turn on and off very fast - 300 kHz switching frequency is pretty typical, while higher end VRMs can be optimized for 500 kHz. You want a higher switching frequency and more "real" phases to smooth out voltage ripple and reduce transient voltage spikes, since that helps keep the CPU stable when overclocking.

A "fake" 8-phase with eight high/low MOSFET pairs and eight inductors but only four controller phases will turn on two "phases" at a time - each pair of phases will be turned on 1/4th of the time. For the most part it will behave like a "real" 8-phase thermally (since it splits the heat load over 8 phases' worth of components), but electrically it will be a 4-phase.

A doubled 4-phase has the controller chip output PWM signals that turn four doubler chips on and off, and each of those doubler chips then acts as a pretty dumb 2-phase controller that turns its two phases on and off alternately. Hence, you get 8 actual electrical phases, so it's almost as good as a "real" 8-phase. The only thing you usually miss out on is current balancing between the two phases in each pair, because the doubler chips are usually too dumb to do that, but I think that's pretty much irrelevant in reality.

Hope I managed to explain it somewhat. If you want the gory details I recommend Texas Instruments' Application Report SLVA882, Multiphase Buck Design From Start to Finish (Part 1). It's actually pretty approachable! e: WikiChips actually has an even more approachable explanation TheFluff fucked around with this message at 23:59 on Aug 11, 2018 |
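In case it helps, here are the three layouts as a toy Python model (the slot layout and names are mine, just to make the counting concrete):

```python
# Toy model of one controller cycle for the three VRM layouts described
# above. Each list entry is a time slot; the numbers are which power
# stages conduct during that slot.

def real_8_phase():
    # 8 PWM signals from the controller: each stage is on 1/8th of the time
    return [[stage] for stage in range(8)]

def doubled_4_phase():
    # 4 PWM signals into doubler chips; each doubler alternates its two
    # stages, so you still get 8 distinct conduction slots (1/8th each)
    return [[2 * pwm + half] for half in range(2) for pwm in range(4)]

def fake_8_phase():
    # 4 PWM signals, each driving two stages in parallel: stages conduct
    # in pairs, and each pair is on 1/4th of the time
    return [[2 * pwm, 2 * pwm + 1] for pwm in range(4)]

for layout in (real_8_phase, doubled_4_phase, fake_8_phase):
    slots = layout()
    stages = sorted(s for slot in slots for s in slot)
    assert stages == list(range(8))  # all 8 stages share the heat load
    print(layout.__name__, "on-fraction per stage:", 1 / len(slots))
```

All three spread the heat over 8 stages, but only the first two give you 8 distinct switching events per cycle - the fake 8-phase has 4, which is what "electrically a 4-phase" means.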
# ¿ Aug 11, 2018 23:23 |
|
B-Mac posted:This post was really informative, thanks. When you say doubled are you talking about have two high side and two low side mosfets per phase, or is there more to it?

Sorry, I got a bit unclear with the terminology. When I said "doubled 4-phase" in the post above I meant "an 8-phase that uses a 4-phase controller plus doubler chips". That is, the controller outputs 4 on/off signals (PWM signals), but instead of going directly to a power stage and inductor (a "phase"), each signal first goes into a doubler chip that turns it into two interleaved (alternating) signals. Hence, you get 8 distinct on/off signals and each MOSFET is turned on 1/8th of the time. You have 8 actual phases in every sense of the word; it's just that the controller chip only sees 4 phases and can only do current balancing on pairs of phases. It would perhaps be more accurate to call it an "8-phase with doublers"...?

When I was talking about a "fake 8-phase", I really meant "a 4-phase that tries to look like an 8-phase by having two of everything in each phase". So, just as with the doubled 4-phase, you have 4 PWM signals coming from the controller, and you have 8 power stages (high/low MOSFETs) and 8 inductors, but they turn on and off in pairs, so each one of them is turned on 1/4th of the time instead of 1/8th of the time. You have the components for 8 phases but from a control standpoint you only have 4.

Hope that clarifies it. I really need to come up with a term other than "phase" for the actual power circuitry so I don't have to say "a power stage, inductor and capacitor" every time. e: for clarity, a power stage is the high and low side MOSFETs combined into a single chip, usually plus some temp/voltage monitoring bits TheFluff fucked around with this message at 01:06 on Aug 12, 2018 |
# ¿ Aug 12, 2018 00:49 |
|
TorakFade posted:Edit 2: in testing more with AIDA64 stress test, I notice that whatever the setting, it'll pull only 1.35V to get an all-core frequency of 3.9/4.0 Ghz at full load. Which means that at idle (with AMD Ryzen balanced setting), it pulls more voltage (1.40-1.45) and overclocks faster (4.2Ghz all core) than at full load? I really don't understand, help guys
|
# ¿ Oct 10, 2018 11:13 |
|
if you're as confident as you sound in those oddly specific predictions you have nothing to lose, no?
|
# ¿ Oct 20, 2018 00:28 |
|
MarksMan posted:I was originally looking at an Intel Xeon E5-2650 v4 as the CPU for a deep learning rig, but have found a mobo that fits well with the specs I need (128gb ram, lots of PCIe slots for 10+ GPU's) -- the X399 Designare-EX -- but it only supports Threadrippers. What would an equivalent AMD processor to the Xeon e5-2650 be? The 1920X has 12 cores/24 threads like the E5-2650, but it's got almost double the TDP and over 50% faster clock speed, so it should perform significantly better. There's also the Zen+ refresh, the 2920X, which is the same thing but a bit better (better memory compatibility, a bit more power efficient, slightly faster), but it's not released quite yet - it launches October 29th. The 1920X is currently around $400 which is a pretty amazing deal for a 12-core CPU. If you were budgeting for a Xeon that costs (as far as I can tell) around $1200, there's really no reason not to go for a 16-core 1950X or 2950X instead. e: do note, I'm not sure what motherboard support for Threadripper 2 (2920X/2950X) is like on that particular motherboard. It might not work at all, or it might work with a BIOS update which might or might not be available and which might or might not require a Threadripper 1 to install. TheFluff fucked around with this message at 01:07 on Oct 21, 2018 |
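The spec numbers behind that comparison, for reference - quoted from memory off the public spec sheets, so double check them; both are 12C/24T parts:

```python
# Spec-sheet numbers from memory - verify against the official product
# pages before relying on these.
xeon_e5_2650_v4 = {"base_ghz": 2.2, "tdp_w": 105}
tr_1920x = {"base_ghz": 3.5, "tdp_w": 180}

clock_ratio = tr_1920x["base_ghz"] / xeon_e5_2650_v4["base_ghz"]
tdp_ratio = tr_1920x["tdp_w"] / xeon_e5_2650_v4["tdp_w"]
print(f"base clock: +{clock_ratio - 1:.0%}, TDP: x{tdp_ratio:.2f}")
# base clock: +59%, TDP: x1.71
```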
# ¿ Oct 21, 2018 00:58 |
|
It's at least 700€ in the EU right now, but it's not in stock at that price. Meanwhile, the 8700K is still around 420€ (used to be 340€) and the 9700K is 500€.
|
# ¿ Nov 10, 2018 20:01 |
|
Paul MaudDib posted:only with PLX switches I know of exactly one: the Asus WS Z390 Pro. There are rumors about one more board (from Supermicro, I think) but that's all.
|
# ¿ Nov 14, 2018 11:33 |
|
Also, after you've done what Lambert suggests once you can usually configure the BIOS to have a 1-3 second boot delay so you have time to press delete or F2 next time, if you want.
|
# ¿ Dec 16, 2018 14:29 |
|
Tech Jesus on X570. Video version: https://www.youtube.com/watch?v=LQHMUJXhxlc
|
# ¿ Jan 18, 2019 19:37 |
|
Judging by the PC part picking thread (very anecdotal, of course), the R5 2600 is probably the most common recommendation overall. It's pretty common to try to talk people who aren't interested in overclocking and aren't going for a particularly high end GPU (and might not even have a high refresh rate display) out of buying a 9600K. They wouldn't really benefit from it at all, and it's much more expensive. The i5-8400 used to hold this "best bang for your buck for pretty much any build under $1500" spot, but now the 2600 is cheaper for basically the same performance (a little bit worse in single-thread, quite a bit better in multi-thread). People who build their own gaming PCs are a minority, and out of that minority, people who are prepared to spend more than $1500 on their build (and/or might already have a high refresh rate monitor) are an even smaller minority. Those people are in a position to really benefit from the Intel advantage in single-thread performance, but they're probably not a huge market even though that's where all the attention is focused. TheFluff fucked around with this message at 15:31 on Jan 28, 2019 |
# ¿ Jan 28, 2019 15:21 |
|
Paul MaudDib posted:Over time, all processors approach worst-case scenario. That's what happens when you upgrade your GPU. I mean, if you're betting on GPUs staying 1070 performance forever, whatever, you do you. Yes, forums poster Paul MaudDib, 128fps rather than 142 (or 151 if you have the 2080Ti - it's not all CPU bound stuff here) definitely makes high refresh rate monitors (approximately all of which have variable refresh rate support these days) "useless". The difference in frame timing is 7.04ms vs 7.81ms, or about 10%. That's definitely Ryzen "flagging" right there. Read your own loving sources you doofus.
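The arithmetic there, spelled out (frametime is just 1000/fps):

```python
# frametime in milliseconds at a given framerate
f142 = 1000 / 142        # ~7.04 ms
f128 = 1000 / 128        # ~7.81 ms
diff = f128 - f142       # ~0.77 ms slower per frame
pct = diff / f142 * 100  # ~10.9%, i.e. "about 10%"
print(round(diff, 2), round(pct, 1))
```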
|
# ¿ Jan 28, 2019 16:20 |
|
Paul MaudDib posted:Oh no, you said "forums poster", I am defecated. You made an idiotic statement (regarding the uselessness of high refresh rate monitors at 128fps). Would you be happier if I called you an idiot instead? Not that I really think you're that dumb; you're just exaggerating your arguments because you like arguing on the internet or something. Therefore, forums poster. The Ryzen 7 2700X is a slower CPU than the i9-9900K. BF V is single thread bound at high framerates like every other game ever, and even an i7-7700K will get 140fps in the 1080p benchmark you linked. Nobody is arguing anything else. If you care about 1ms tighter frametimes when you're already 90% of the way to what money reasonably can buy, then yes, Intel is what you should buy. This is not controversial in any way whatsoever. Stop hurfing all this stupid durf. e: I play Rainbow Six Siege at 4K with a GTX 1080 and an i7-8700K and I get ~110fps (render resolution at 60% or something like that), gently caress off with your dumbass trolling. TheFluff fucked around with this message at 16:41 on Jan 28, 2019 |
# ¿ Jan 28, 2019 16:38 |
|
Paul MaudDib posted:Apologies for impairing the e-honor of the 2700X. I should not have implied that game CPU requirements would increase over time, especially with consoles making a big jump in single-threaded performance as they jump to the Zen architecture. That was not very politic. Attempting to future proof gaming builds is dumb and has always been dumb.
|
# ¿ Jan 28, 2019 16:42 |
|
Paul MaudDib posted:There is no question that the 2700X is a good processor for today's games. I apologize if you perceived otherwise and were offended. I don't agree. The 2700/2700X are not particularly good value for pure gaming builds and you'll probably be better served with Intel. The 2600 is where the sweet spot is and it's good enough for pretty much everyone who's gonna spend less than $1500. Below that breakpoint the money is usually better spent on a better GPU or a better monitor. It's pointless to try to predict what kind of CPU games might need in three years from now and even more pointless to try to buy that CPU today.
|
# ¿ Jan 28, 2019 16:47 |
|
Truga posted:people kept telling me this until i bought a c2d instead of a c2q because it clocked that much higher and got 10% more fps. then a couple years later i had to replace the c2d, because two threads were wildly ineffectual while c2qs ran fine for almost a decade. Hindsight is always 20/20. Hardware performance improvements have already slowed down significantly, and I don't think it's reasonable to expect either huge single thread performance increases or CPU's with more than 8 cores for anywhere near mainstream budgets in the next few years. I don't think games that are starting development now are going to be all that much more demanding when it comes to CPU workload than the current ones. That's just me though, and I don't think you should take my advice - nor anybody else's, really - when it comes to predicting the future.
|
# ¿ Jan 28, 2019 16:56 |
|
pixaal posted:In my case, I'm the IT manager, and they are getting the same budget everyone else in design gets for their hardware so yes they will. Most places don't have someone in IT that actually understands the needs of a design department. So many people see a monitor is a monitor it doesn't matter if the color is accurate it just needs to be close enough. Which is pretty true in the business world, but if you are creating content you need the baseline to be good. At my workplace the user interface designers keep a few old lovely TN monitors around specifically in order to be able to test that the interfaces look okay on low end hardware.
|
# ¿ Mar 7, 2019 02:32 |
|
From buildzoid's Discord. Pretty obvious stuff, but still.
|
# ¿ Mar 12, 2019 20:04 |
|
Combat Pretzel posted:Is there even a reliable unified list of POST codes? No, but the motherboard manual should have a fairly comprehensive list of codes for that specific motherboard.
|
# ¿ Mar 15, 2019 19:21 |
|
Argus Monitor is a decent modern alternative to SpeedFan, and less obnoxious than most motherboard vendor fan control software.
|
# ¿ Mar 17, 2019 13:23 |
|
Statutory Ape posted:Is there laptop compatibility there? Jw No idea, never tried it on anything other than a desktop machine. e: the motherboard compatibility list does say it supports "Lenovo / IBM Thinkpad Notebooks" and "Dell Notebooks". There's a free trial, so try it I guess? TheFluff fucked around with this message at 14:18 on Mar 17, 2019 |
# ¿ Mar 17, 2019 14:15 |
|
stop watching adoredtv paul like seriously, what the gently caress is that post? thread: a- paul: *comes crashing in through the window* ZEN2 WILL NOT HIT 5.1GHz TheFluff fucked around with this message at 15:32 on Mar 27, 2019 |
# ¿ Mar 27, 2019 15:29 |
|
Palladium posted:I may be wrong here regarding Gigabyte's dual-bios but I feel its just a gimmick that creates more bugs than providing actual convenience. The variant with dual bios but no physical bios switch that switches automatically on bluescreens etc, yes, absolutely. It's terrible. The ones with a physical switch are fine.
|
# ¿ Apr 13, 2019 14:12 |
|
The Wikipedia page on Cell still has a lot of that mid-noughties hype lying around - all the talk about possible future applications just stops dead around 2007-2008 though
|
# ¿ Apr 18, 2019 19:41 |
|
Klyith posted:It wasn't like they were totally on an island by themselves with the Cell, remember that bulldozer had the weird off-balance 1 ALU / 2 FPU design as well. Even though bulldozer FPUs were not the restricted and difficult Cell SPEs, there was still the idea that the multimedia future was going to call for lots of FLOPS. Sure, it's just funny to me how the article still has this very optimistic tone about the future of the Cell despite the fact that it was pretty much discontinued ten years ago.
|
# ¿ Apr 19, 2019 12:06 |
|
I think the motherboard vendors are going kinda overkill on the high end boards by dimensioning them for exotic-coolant overclocking of a 16C/32T part. For a more reasonable-for-home-use 6- or 8-core on ambient cooling you should still be fine with the $100-$150 X470 boards. The new fancy PCIe stuff is expensive too of course. e: also that ^^^^^^ spasticColon posted:I'm sorry but this feels like a wet fart to me or maybe I just bought into the hype too much. Wait for the benchmarks, as always.
|
# ¿ May 27, 2019 19:32 |
|
PC LOAD LETTER posted:No, they spec'd to the 16C part AMD showed them apparently. Isn't that exactly what I said?
|
# ¿ May 27, 2019 20:03 |
|
Khorne posted:What are the prices? I somehow missed that leak. GamersNexus had pricing info for at least the Gigabyte boards in their coverage: https://www.youtube.com/watch?v=yDA0zBR6MgA (Unfortunately no text form available, video only.)

stealth edit:
- Gigabyte X570 Xtreme: ~$600
- Gigabyte X570 Master: ~$350
- Gigabyte X570 Ultra: ~$300
- Gigabyte X570 Pro: ~$260 ($250 without wifi)
- Gigabyte X570 Elite: ~$200
- Gigabyte X570 Gaming X: ~$170

As a point of reference, comparing a few names (like, Gaming X with Gaming X) with their Z390 lineup at retail pricing right now, these boards are ~$30-$40 more expensive. The Z390 Gaming X is $140, and the Z390 Xtreme is $560, for example. e2: they actually have real heatsinks with fin stacks and everything for the VRM now too, so if anything you'd be able to run more CPU with less VRM with these, assuming you have some decent airflow in your case TheFluff fucked around with this message at 22:11 on May 27, 2019 |
# ¿ May 27, 2019 22:01 |
|
jisforjosh posted:Jesus what's with the ~$80 premium on the Pro version compared to the Z390 Pro at $160-190 Dunno, but the Z390 Pro is weirdly placed even within its own chipset since the Z390 Elite retails for the same price. I can't be arsed to look up what the differences between them are at the moment.
|
# ¿ May 27, 2019 22:11 |
|
jisforjosh posted:The Pro is the better board. Higher end VRM cooling solution as well as more fan headers, temperature sensors, SLI/Crossfire support, and a USB type C header. Right. Looking at Z390 VRM charts, though, even the Z390 Elite is considered good enough for ~200W to the socket (a 9900K on air cooling in extreme AVX workloads, basically), with the rest of the product stack being for custom water loops and more exotic solutions. e: For comparison, my 8700K at 4.9GHz with no AVX offset draws something like 175W when running P95, and that's as high as it goes. The 9900K is basically that with two more cores, so as a rough ballpark I figure it'd land somewhere around 225W. TheFluff fucked around with this message at 22:26 on May 27, 2019 |
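That ballpark is just linear scaling by core count, which is crude (it ignores binning, uncore power, and so on) but good enough for sizing a VRM:

```python
# Crude estimate: assume package power scales linearly with core count
# at the same clocks and workload. Binning and uncore power ignored.
p_8700k_w = 175                    # observed 6-core P95 draw at 4.9 GHz
p_9900k_est_w = p_8700k_w / 6 * 8  # scale to 8 cores
print(round(p_9900k_est_w))        # 233 - so "somewhere around 225 W" is fair
```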
# ¿ May 27, 2019 22:23 |
|
Deathreaper posted:Any x570 mini itx boards with 10 Gb ethernet spotted? Really looking forward to upgrading my power hungry i7 5960x itx build... 10 gbit Ethernet is still extremely rare on consumer motherboards. As in, there are only like three boards I know of that have it at all, and the only one that is even remotely reasonably priced is the Asrock Taichi Ultimate. There's a handful of boards with 2.5 or 5 gbit, but I tried looking just now and could not find even a single consumer ITX board with anything beyond 1 gbit on the market today. Maybe this will change with PCIe 4 but it's too early to tell. TheFluff fucked around with this message at 01:01 on May 28, 2019 |
# ¿ May 28, 2019 00:57 |
|
Hob_Gadling posted:So what's the verdict: are new AMD GPUs promising for a 4K display or should I wait for one more generation before playing games again? Depending on what games you play, not even Nvidia's GPU's are particularly promising for 4K gaming. The 2080 will do 60fps in most titles but not at the highest settings. Nothing indicates the upcoming AMD cards will do better than that.
|
# ¿ Jun 2, 2019 14:24 |
|
That's only relevant if you have a Ryzen APU (that is, onboard graphics). The 1700 doesn't have an onboard GPU, so you should be fine. See http://asrock.pc.cdn.bitgravity.com/TSD/Display%20recovery%20SOP.pdf
|
# ¿ Jun 13, 2019 14:02 |