|
Combat Pretzel posted:Someone explain the Turbo Boost stuff to me. How does the CPU decide when to cut back on the overclocking? Simply on thermals? If so, sticking a big rear end cooler on the CPU should keep it in Turbo Mode under load for practically forever? Two ways: thermal and voltage. Too much sustained voltage will also cut down turbo; in my experience that only happens on Y-series processors. My i5-4300Y in my Dell Venue Pro tablet will turbo for 20 seconds and then cut back automatically due to the voltage cutoff, before it heats up enough to throttle on thermals.
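That 20-seconds-then-cut-back pattern is what a sustained-power budget looks like (Intel's short/long power limits with a time constant). A toy sketch of the idea, with made-up numbers rather than real i5-4300Y limits:

```python
# Toy model of a sustained-power turbo budget (PL1/PL2-style).
# pl1/pl2/tau values here are illustrative, not real chip limits.
def simulate(pl1=6.0, pl2=12.0, tau=20.0, seconds=60):
    """Return per-second clock state: 'turbo' while the running
    average power stays under the sustained limit, 'base' after."""
    avg = 0.0          # exponentially weighted average power draw
    states = []
    for _ in range(seconds):
        power = pl2 if avg < pl1 else pl1   # turbo draws pl2 watts
        avg += (power - avg) / tau          # EWMA with time constant tau
        states.append("turbo" if power == pl2 else "base")
    return states

states = simulate()
```

The chip sprints at the short-term limit until the running average catches up with the sustained limit, then settles down, regardless of how good the cooler is.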
|
# ? Feb 11, 2016 16:40 |
|
|
|
blowfish posted:sorry, it's a bit hard to remember all the companies that lost against nvidia/wintel They didn't "loose" as their cards at the time were easily neck and neck with Nvidia. They did however look appeasing to AMD, which bought them out to get themselves a very good GPU business to help compete with Intel's integrated GPUs for mobile devices. Luckily they know that keeping the dedicated GPU business alive is good for them too, and even better they are sort of spinning off the GPU wing again as a semi-separate company lol. Should have just left them separate as ATI but well, Corps will be Corps.
|
# ? Feb 11, 2016 17:21 |
|
movax posted:Sounds like they adjusted the BIOS's calculation / adjustment of TOLUD (Top of Low Usable DRAM) during its resource / memory map allocation, but my real question is, why are you stuck with a 32-bit OS? Some particular piece of software? Yep, it's the software. I am spec'ing computers to run scientific instruments (some of which have associated hardware that is 20 years old!) with specific hardware drivers that were never updated (and never will be) to support a 64-bit OS. My customers are lucky they're able to run a modern OS as it is; some of our older systems are still operating on Windows NT-era PCs.
|
# ? Feb 11, 2016 19:06 |
|
Combat Pretzel posted:But in a desktop CPU, given sufficient cooling, it should be able to sustain turbo mode? I have a Dark Rock 3 from beQuiet! sitting on my 5820K, which is a sufficiently big device cooler, I'd figure. It might throttle under some loads, but mostly ones that you only see in stress testing and benchmarking programs, not real world ones.
|
|
# ? Feb 11, 2016 19:25 |
|
Combat Pretzel posted:But in a desktop CPU, given sufficient cooling, it should be able to sustain turbo mode? I have a Dark Rock 3 from beQuiet! sitting on my 5820K, which is a sufficiently big device cooler, I'd figure. Turbo mode in the non-throttled chips tends to be based on how many cores are in use, so one core being maxed will give full turbo speed and the clocks go down as more and more cores are used. Some board manufacturers like Asus have a multi-core enhancement option in the BIOS which will run all the cores at max turbo speed regardless of load on the other cores. Non-TDP-limited Intel CPUs will turbo at full speed up until the 100°C throttle point.
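As a sketch of how those per-core turbo bins behave (the multiplier table below is made up for illustration, not a real 5820K bin table):

```python
# Hypothetical turbo bin table: max multiplier by number of active cores.
# Values are invented for illustration; real tables vary per SKU.
TURBO_BINS = {1: 36, 2: 36, 3: 35, 4: 34, 5: 34, 6: 33}

def turbo_ghz(active_cores, multicore_enhancement=False):
    """All-core max if the board forces 'multicore enhancement',
    otherwise look up the bin for the active-core count."""
    mult = TURBO_BINS[1] if multicore_enhancement else TURBO_BINS[active_cores]
    return mult / 10  # 100 MHz bus clock, so multiplier / 10 = GHz

turbo_ghz(1)        # single core loaded: highest bin
turbo_ghz(6)        # all cores loaded: lowest bin
turbo_ghz(6, True)  # MCE forces the single-core bin on every core
```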
|
# ? Feb 11, 2016 19:53 |
|
.
sincx fucked around with this message at 05:55 on Mar 23, 2021 |
# ? Feb 12, 2016 05:09 |
|
I've been meaning to ask this for a while. My workstation PC has a 5820K in it with a 240mm AIO liquid cooler. I built it myself and it runs great for Solidworks stuff. During rendering it clocks 4.4GHz at 50°C. Problem is that I've been trying to get it to run at 3.6GHz by default because some programs don't trigger the turbo boost and it runs slow as poo poo. After adjusting the BIOS to a higher minimum clock, it seems to switch between 1.2GHz and 3.6GHz several times per second according to Intel Power Gadget. Is this safe for the CPU? I'm worried the constant switching is damaging things. How do I get it to run at a constant bottom-end speed?
|
# ? Feb 12, 2016 09:10 |
|
It's by design. The faster you can switch power states, the more power you can save by optimizing the amount of time spent in a lower state.
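Concretely: dropping into an idle state during a gap in the workload only pays off if the gap outlasts the transition overhead, so faster switching means more gaps can be harvested. A toy model with assumed wattages:

```python
# Idle-gap harvesting with assumed wattages (not real CPU figures).
P_ACTIVE, P_IDLE = 20.0, 2.0   # watts

def gap_energy(gap, t_switch):
    """Energy (joules) spent over one workload gap of `gap` seconds,
    given `t_switch` seconds of transition overhead at active power."""
    if gap <= t_switch:
        return P_ACTIVE * gap               # not worth leaving the active state
    return P_ACTIVE * t_switch + P_IDLE * (gap - t_switch)

gap_energy(0.1, 0.2)    # slow transitions: stay active through the whole gap
gap_energy(0.1, 0.01)   # fast transitions: most of the gap is spent idle
```

Cutting the transition time turns the same 0.1 s gap from a total loss into a large energy saving.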
|
# ? Feb 12, 2016 09:14 |
|
eggyolk posted:I've been meaning to ask this for a while. Also, if your CPU isn't clocking up under load, that sounds like you have something set up wrong; my 5820K clocks up at the slightest hint of load.
|
# ? Feb 12, 2016 09:15 |
|
eggyolk posted:I've been meaning to ask this for a while. It sounds more like a software issue; the post above is helpful, so check the power options in Windows. Worrying about the CPU rapidly changing state is not necessary, though. It's designed for that.
|
# ? Feb 12, 2016 11:30 |
|
eggyolk posted:I've been meaning to ask this for a while. This feature is totally normal and actually has a name, SpeedStep - it's been around a while, at least ten years on mobile processors. It allows the processor to save a lot of power when it's not loaded, and definitely won't cause damage since it's just underclocking and undervolting the processor on the fly. It should stay at full speed under load though. Some motherboards have a feature to peg the processor at full speed (actually, Windows might too in advanced power options) but it doesn't really get you anything but a higher power bill unless your proc isn't staying at full speed when loaded for some reason.
|
# ? Feb 12, 2016 15:45 |
|
Shouldn't it ramp the multipliers up and down instead of switching between just low and high performance states?
|
# ? Feb 12, 2016 15:53 |
|
Eletriarnation posted:Some motherboards have a feature to peg the processor at full speed (actually, Windows might too in advanced power options) but it doesn't really get you anything but a higher power bill unless your proc isn't staying at full speed when loaded for some reason. I'd say in a desktop it's very much worth the tradeoff; you're looking at a grand total of 56 cents a month per watt of difference in the power bill at California's power prices. Also worth noting that you have to copy the "High Performance" power profile if you choose to do it through Windows; there are a lot of invisible, registry-only flags governing load-based speedup that you inherit from the Power Saver/Balanced plans.
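For reference, the per-watt arithmetic works out like this (the electricity rate is an assumption; plug in your own):

```python
# Monthly cost of one extra watt drawn 24/7, at an assumed rate.
HOURS_PER_MONTH = 24 * 365 / 12    # ~730 hours

def cents_per_watt_month(rate_usd_per_kwh):
    """Cents per month for each sustained watt of extra draw."""
    kwh = 1 * HOURS_PER_MONTH / 1000   # one watt for one month, in kWh
    return kwh * rate_usd_per_kwh * 100

cents_per_watt_month(0.20)   # at an assumed 20 c/kWh rate
cents_per_watt_month(0.77)   # rate implied by ~56 c per watt-month
```

One watt-month is about 0.73 kWh, so hitting 56 cents takes a marginal rate near $0.77/kWh; at more typical rates it is closer to 15 cents.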
|
# ? Feb 12, 2016 16:02 |
|
eggyolk posted:I've been meaning to ask this for a while. If you are running Windows, go to Power Options under Control Panel / Hardware and Sound, show the additional plans if you only see Balanced and Power Saver, and select High Performance. That one keeps the CPU near its top turbo clock most of the time, even at idle, so when you aren't doing much you might want to switch back to Balanced. Another thing I recently discovered is making a new power plan based off of High Performance. If you do that and set the Minimum Processor State to 5% and the Maximum to 100%, it will still idle when it's doing nothing, like in Balanced, but it shoots up to near max turbo a lot quicker than Balanced does. Works great for my VR stuff and for games that don't pull the CPU up to speed on their own, yet benefit a good bit when it is there. Really weird.
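Those clicks can also be scripted with powercfg, if you prefer. A sketch; `<new-guid>` is a placeholder for the GUID the duplicate step prints, not a real value:

```shell
:: Clone High Performance (SCHEME_MIN is its built-in alias); prints a new GUID
powercfg /duplicatescheme SCHEME_MIN

:: Using that printed GUID, set minimum 5% / maximum 100% processor state (AC power)
powercfg /setacvalueindex <new-guid> SUB_PROCESSOR PROCTHROTTLEMIN 5
powercfg /setacvalueindex <new-guid> SUB_PROCESSOR PROCTHROTTLEMAX 100

:: Make the new plan active
powercfg /setactive <new-guid>
```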
|
# ? Feb 12, 2016 18:01 |
|
EdEddnEddy posted:They didn't "loose" as their cards at the time were easily neck and neck with Nvidia. Not to be that guy, but you mean lose and appealing. ATI was always held back by kinda crappy drivers even when the hardware was good, though.
|
# ? Feb 14, 2016 13:36 |
|
Yep. My English on Friday apparently took a nosedive; it was a rough week, and Saturday didn't help make it any better. I agree their drivers were always a mixed bag too. Usually you ended up staying with the one good one that worked with your card and all the games you wanted to play at the time. They have gotten better, but so has Nvidia, so it is a constant uphill battle for them. At least the next gen of cards should bring some much-needed competition once again, if Nvidia doesn't just curbstomp them with their new tech right out of the gate. Both have had major time to put R&D into their new stuff, given how long they have been sitting on their current tech, with Fury being the only real newish tech in a long while.
|
# ? Feb 16, 2016 20:29 |
|
https://www.youtube.com/watch?v=frNjT5R5XI4 2500k vs skylake.
|
# ? Feb 21, 2016 05:40 |
|
El Scotch posted:https://www.youtube.com/watch?v=frNjT5R5XI4 The summary of this is basically "Definitely if you don't overclock, maybe if you do. Also memory bandwidth is an important factor. Also AMD can be pretty bad for DX11 games."
|
# ? Feb 21, 2016 06:17 |
|
Interesting. My buddy has an i7-3960X overclocked to 4.2GHz (or around there). Would it be comparable to a stock Skylake i7 in most applications or would Skylake just stomp all over it?
|
# ? Feb 21, 2016 06:37 |
|
computer parts posted:The summary of this is basically "Definitely if you don't overclock, maybe if you do. Also memory bandwidth is an important factor. Also AMD can be pretty bad for DX11 games." More like "Yeah it's a little better but for the price of a CPU + mobo + RAM you're looking at 10-20% improvement at best, and sometimes near zero, depending on game and GPU." So basically wait for the next iteration unless you REALLY need those last few FPS for whatever reason, enjoy spending $500+ for incremental gains, and/or think that $500+ is worth the other bits that Skylake motherboards bring to the table.
|
# ? Feb 21, 2016 06:42 |
|
Edit read the numbers wrong, ignore this.
|
# ? Feb 21, 2016 07:10 |
|
El Scotch posted:https://www.youtube.com/watch?v=frNjT5R5XI4 They've just put up a piece analysing the 3770K, too.
|
# ? Feb 21, 2016 10:33 |
|
There's something ironically reassuring about the fact that Richard isn't a particularly slick or natural presenter - he's just a guy who really knows what he's talking about and carries the video with that.
|
# ? Feb 21, 2016 11:24 |
|
It seems like the increase in memory bandwidth only matters going from 1333-1600 to 2133 and not much above. For example, I can't find any tests that suggest quad channel helps with framerates. So I wonder if the upgrade to 2133 would matter if you are on an older quad-channel platform with, say, 1600MHz RAM.
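Back-of-the-envelope peak numbers help frame that: each channel is 64 bits wide, so theoretical peak bandwidth is roughly channels × transfer rate × 8 bytes (measured numbers land well below these):

```python
# Theoretical peak bandwidth: channels * MT/s * 8 bytes per 64-bit channel.
# Real measured bandwidth comes in well under these peaks.
def peak_gb_s(channels, mt_s):
    return channels * mt_s * 8 / 1000   # MT/s * 8 B = MB/s; /1000 = GB/s

peak_gb_s(2, 2133)   # dual-channel DDR4-2133: ~34 GB/s peak
peak_gb_s(4, 1600)   # quad-channel DDR3-1600: ~51 GB/s peak
peak_gb_s(4, 2133)   # quad-channel DDR4-2133: ~68 GB/s peak
```

So an older quad-channel 1600 setup already has more raw headroom than dual-channel 2133, which fits the observation that quad channel rarely shows up in framerates.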
|
# ? Feb 21, 2016 11:38 |
|
Well, and here I thought 1600 was plenty. It probably was a couple of years ago, and I haven't bothered reading new articles until now.
|
# ? Feb 21, 2016 14:40 |
|
Panty Saluter posted:Well and here I thought 1600 was plenty. Probably was a couple of years ago, and I haven't bothered reading new articles until now Surprised me too; I thought faster memory meant bupkis.
|
# ? Feb 21, 2016 18:06 |
|
According to Intel my CPU only supports 1600 anyway, so that's a money saver.
|
# ? Feb 21, 2016 18:20 |
|
Panty Saluter posted:According to Intel my CPU only supports 1600 anyway, so that's a money saver. If you have a -K chip on a Z-series chipset, you can overclock your RAM just like anything else. You can run DDR3 at however fast it's stable on a 2500K, which is likely around 2133/2400 with modern memory.
|
# ? Feb 21, 2016 18:34 |
|
If my RAM is only rated for 1600 is it worth trying to bump it up? Or is that guaranteed no-post?
|
# ? Feb 21, 2016 18:38 |
|
Panty Saluter posted:If my RAM is only rated for 1600 is it worth trying to bump it up? Or is that guaranteed no-post? Ratings just mean "this is what we guarantee". It's quite likely you can bump it up with no issues, or you might be unlucky and get the stuff that can't go much faster.
|
# ? Feb 21, 2016 18:39 |
Panty Saluter posted:If my RAM is only rated for 1600 is it worth trying to bump it up? Or is that guaranteed no-post? Sure, you can try bumping it up; the worst that happens is that it won't boot at that speed or you get some instability, and you can always turn the speed and timings back to default if you need to.
|
|
# ? Feb 21, 2016 18:41 |
|
My old 1600MHz Kingston Blu would run at 2000 all day.
|
# ? Feb 21, 2016 19:24 |
|
Sweet, time to get that sweet 1% performance increase (maybe)
|
# ? Feb 21, 2016 19:25 |
|
Alchenar posted:There's something ironically reassuring about the fact that Richard isn't a particularly slick or natural presenter - he's just a guy who really knows what he's talking about and carries the video with that. I just wish he could pronounce his "R"s.
|
# ? Feb 21, 2016 21:21 |
|
PerrineClostermann posted:I just wish he could pronounce his "R"s. Wow, nice Amerocentric diction you troglodyte
|
# ? Feb 21, 2016 23:28 |
|
Panty Saluter posted:Wow, nice Amerocentric diction you troglodyte Twahglodyte
|
# ? Feb 22, 2016 06:31 |
|
syntaxfunction posted:Interesting. My buddy has a i7-3960X overclocked to 4.2GHz (Or around there). Would it be comparable to a stock Skylake i7 in most applications or would Skylake just stomp all over it? It depends more on the threading. I believe Skylake does beat the old SB-E in single-threaded stuff at this point (maybe by more at stock than overclocked, and with an X he should be able to get well past 4.2GHz), but in multi-threaded stuff he should come out ahead pretty well since he has a full 6 cores vs 4. Now how quad-channel DDR3 compares to dual-channel DDR4 on Skylake, well, that's something I haven't looked at. Haswell-E has quad-channel DDR4 though, so it's not like Skylake has a definite advantage there either.
|
# ? Feb 22, 2016 19:52 |
|
Quad channel DDR4 should have a definitive advantage over dual channel, no? That other video earlier showing that faster RAM would result in like 10ish percent of more performance, double the amount of channels should surely result in more? --edit: AIDA64 lists 42GB/s on a 5820K with quad channel CL15 DDR4-2133 vs. 30GB/s on a 6700K with dual channel CL14 DDR4-2133. --edit2: vvv Well, that guy in the video benchmarked framerates in video games and got higher ones with the faster RAM. How does that not count? Combat Pretzel fucked around with this message at 21:25 on Feb 24, 2016 |
# ? Feb 24, 2016 21:18 |
|
|
|
Combat Pretzel posted:Quad channel DDR4 should have a definitive advantage over dual channel, no? In benchmarks yes. In real world rarely.
|
# ? Feb 24, 2016 21:19 |