Asomodai
Jun 4, 2005

POSTING IN TFR = DONT ASK DONT TELL AM I RITE?

Combat Pretzel posted:

Someone explain the Turbo Boost stuff to me. How does the CPU decide when to cut back on the overclocking? Simply on thermals? If so, sticking a big rear end cooler on the CPU should keep it in Turbo Mode under load for practically forever?

Two ways: thermal and voltage. Too much sustained voltage will also cut down turbo. In my experience it only happens on Y-series processors. My i5-4300Y in my Dell Venue Pro tablet will turbo for 20 seconds and then cut back automatically due to the voltage cutoff, before it overheats enough to throttle on thermals.
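The two cutoffs described here can be sketched as a toy decision model. The thresholds below are illustrative assumptions, not Intel's actual fused limits:

```python
# Toy model of turbo cutback: thermal limit vs. sustained-voltage budget.
# Both thresholds are illustrative; real limits live in firmware/MSRs.
THERMAL_LIMIT_C = 100    # junction temp at which thermal throttling kicks in
VOLTAGE_BUDGET_S = 20    # seconds of sustained high voltage allowed (Y-series style)

def turbo_allowed(temp_c, seconds_at_high_voltage):
    """Return True if the CPU may stay in turbo under this toy model."""
    if temp_c >= THERMAL_LIMIT_C:
        return False                     # thermal throttle
    if seconds_at_high_voltage >= VOLTAGE_BUDGET_S:
        return False                     # voltage/power budget exhausted
    return True

# A Y-series-like part: cool enough, but out of voltage budget after 20 s.
print(turbo_allowed(65, 25))   # False
# A desktop part with a big cooler: within both limits.
print(turbo_allowed(65, 5))    # True
```

The point of the model: a bigger cooler only helps with the first check, which is why the Y-series part above still drops out of turbo.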


EdEddnEddy
Apr 5, 2012



blowfish posted:

sorry, it's a bit hard to remember all the companies that lost against nvidia/wintel

They didn't "loose" as their cards at the time were easily neck and neck with Nvidia.
They did however look appeasing to AMD which bought them out to get themselves a very good GPU business to help compete with Intel's Integrated GPU's for mobile devices.
Luckily they know that keeping the Dedicated GPU business alive is good for them too, and even better they are sort of spinning off the GPU wing again as a semi separate company lol. Should have just left them separate as ATI but well, Corps will be Corps.

jumba
Sep 6, 2004

Hello! Press a button!

movax posted:

Sounds like they adjusted the BIOS's calculation / adjustment of TOLUD (Top of Low Usable DRAM) during its resource / memory map allocation, but my real question is, why are you stuck with a 32-bit OS? Some particular piece of software?

Yep, it's the software. I am spec'ing computers to run scientific instruments (some of which have associated hardware that is 20 years old!) that have specific hardware drivers which were never updated (and will never be updated) to support a 64-bit OS. My customers are lucky they're able to run a modern OS as it is; some of our older systems are still operating on Windows NT-era PCs.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Asomodai posted:

Two ways: thermal and voltage. Too much sustained voltage will also cut down turbo. In my experience it only happens on Y-series processors. My i5-4300Y in my Dell Venue Pro tablet will turbo for 20 seconds and then cut back automatically due to the voltage cutoff, before it overheats enough to throttle on thermals.
But in a desktop CPU, given sufficient cooling, it should be able to sustain turbo mode? I have a Dark Rock 3 from beQuiet! sitting on my 5820K, which is a sufficiently big device cooler, I'd figure.

AVeryLargeRadish
Aug 19, 2011

WolfDad is Best Dad.


Combat Pretzel posted:

But in a desktop CPU, given sufficient cooling, it should be able to sustain turbo mode? I have a Dark Rock 3 from beQuiet! sitting on my 5820K, which is a sufficiently big device cooler, I'd figure.

It might throttle under some loads, but mostly ones that you only see in stress testing and benchmarking programs, not real world ones.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.


Combat Pretzel posted:

But in a desktop CPU, given sufficient cooling, it should be able to sustain turbo mode? I have a Dark Rock 3 from beQuiet! sitting on my 5820K, which is a sufficiently big device cooler, I'd figure.

Turbo mode on the non-throttled chips tends to be based on how many cores are in use, so one core being maxed will give full turbo speed, and the clocks go down as more and more cores are used. Some board manufacturers like ASUS have a multi-core enhancement option in the BIOS which will run all the cores at max turbo speed regardless of load on the other cores. Non-TDP-limited Intel CPUs will turbo at full speed up until the 100°C throttle point.
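The per-core-count behavior can be illustrated with a small lookup table. The bin values below are made-up illustrations, not the real 5820K's fused bins:

```python
# Hypothetical turbo bins: max multiplier as a function of active core count.
# Real bins are fused per SKU; these numbers are illustrative only.
TURBO_BINS = {1: 37, 2: 37, 3: 36, 4: 35, 5: 34, 6: 33}
BCLK_MHZ = 100

def max_turbo_mhz(active_cores, multicore_enhancement=False):
    """Highest allowed clock for the given number of loaded cores."""
    if multicore_enhancement:
        # Board options like ASUS's MultiCore Enhancement apply the
        # single-core bin to all cores regardless of load.
        return TURBO_BINS[1] * BCLK_MHZ
    return TURBO_BINS[active_cores] * BCLK_MHZ

print(max_turbo_mhz(1))                               # 3700
print(max_turbo_mhz(6))                               # 3300
print(max_turbo_mhz(6, multicore_enhancement=True))   # 3700
```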

sincx
Jul 13, 2012

What actually transpires beneath the veil of an event horizon? Decent people shouldn't think too much about that.

jumba posted:

Yep, it's the software. I am spec'ing computers to run scientific instruments (some of which have hardware associated with it that is 20 years old!) that have specific hardware drivers which were never updated (and will never be updated) to support a 64-bit OS. My customers are lucky they're able to run a modern OS as it is, some of our older systems are still operating on Windows NT era PCs.

Would VMs with VT-d work?

eggyolk
Nov 8, 2007

NO FAT CHICKS
WOOOOOOOOO!
(so lonely)


I've been meaning to ask this for a while.
My workstation PC has a 5820K in it with a 240mm AIO liquid cooler. I built it myself and it runs great for Solidworks stuff. During rendering it clocks 4.4GHz at 50°C.
The problem is that I've been trying to get it to run at 3.6GHz by default, because some programs don't trigger the turbo boost and it runs slow as poo poo. After adjusting the BIOS to a higher minimum clock, it seems to switch between 1.2GHz and 3.6GHz at a very high frequency according to Intel Power Gadget. Is this safe for the CPU? It switches frequencies several times per second and I'm worried it's damaging things. How do I get it to run at a constant bottom-end speed?

Boiled Water
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER


It's by design. The faster you can switch power states, the more power you can save by maximizing the amount of time spent in a lower state.
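A rough illustration of why the fast switching pays off: average power is just a duty-cycle-weighted mean of the state powers. The wattages below are assumptions for illustration, not measured values:

```python
# Why rapid P-state switching helps: average power is a weighted mean
# of the per-state powers. Both wattages are illustrative assumptions.
P_HIGH_W = 40.0   # assumed package power at the 3.6 GHz state
P_LOW_W = 5.0     # assumed package power at the 1.2 GHz idle state

def average_power(fraction_in_high_state):
    """Mean package power for a given duty cycle in the high state."""
    return fraction_in_high_state * P_HIGH_W + (1 - fraction_in_high_state) * P_LOW_W

# A bursty workload that needs the high state only 10% of the time
# averages 8.5 W, versus 40 W if the clock were pegged at maximum.
print(average_power(0.10))
```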

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

eggyolk posted:

I've been meaning to ask this for a while.
My workstation PC has a 5820K in it with a 240mm AIO liquid cooler. I built it myself and it runs great for Solidworks stuff. During rendering it clocks 4.4 at 50C.
Problem is that I've been trying to get it to run at 3.6GHz default because some programs don't trigger the turbo boost and it runs slow as poo poo. After adjusting the bios to a higher minimum clock, it seems to switch between 1.2Ghz and 3.6 at a very high frequency according to Intel Power Gadget. Is this safe for the CPU? It switches frequencies several times per second and I'm worried it's damaging things. How do I get it to run at a constant bottom end speed?
I don't know why you're changing BIOS settings when minimum CPU speed is an option you can change in Windows. Look under Power Options advanced settings, Processor Power Management.

Also if your CPU isn't clocking up under load that sounds like you have something set up wrong, my 5820k clocks up at the slightest hint of load
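That same Windows setting can also be scripted from an elevated prompt with `powercfg`. Here's a sketch that only builds the command lines; the `SCHEME_CURRENT` / `SUB_PROCESSOR` / `PROCTHROTTLEMIN` alias names are standard, but verify them with `powercfg /aliases` on your machine before relying on this:

```python
# Build powercfg commands to pin the minimum processor state (Windows).
# SCHEME_CURRENT, SUB_PROCESSOR, and PROCTHROTTLEMIN are standard powercfg
# aliases; check `powercfg /aliases` on your system to confirm.
def min_proc_state_commands(percent):
    """Return the powercfg invocations to set the AC minimum processor state."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return [
        ["powercfg", "/setacvalueindex", "SCHEME_CURRENT", "SUB_PROCESSOR",
         "PROCTHROTTLEMIN", str(percent)],
        # Re-apply the scheme so the changed value takes effect immediately.
        ["powercfg", "/setactive", "SCHEME_CURRENT"],
    ]

for cmd in min_proc_state_commands(100):
    print(" ".join(cmd))
```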

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


eggyolk posted:

I've been meaning to ask this for a while.
My workstation PC has a 5820K in it with a 240mm AIO liquid cooler. I built it myself and it runs great for Solidworks stuff. During rendering it clocks 4.4 at 50C.
Problem is that I've been trying to get it to run at 3.6GHz default because some programs don't trigger the turbo boost and it runs slow as poo poo. After adjusting the bios to a higher minimum clock, it seems to switch between 1.2Ghz and 3.6 at a very high frequency according to Intel Power Gadget. Is this safe for the CPU? It switches frequencies several times per second and I'm worried it's damaging things. How do I get it to run at a constant bottom end speed?

It sounds more like a software issue; the post above is helpful, so check the power options in Windows. There's no need to worry about the CPU rapidly changing state, though. It's designed for that.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


eggyolk posted:

I've been meaning to ask this for a while.
My workstation PC has a 5820K in it with a 240mm AIO liquid cooler. I built it myself and it runs great for Solidworks stuff. During rendering it clocks 4.4 at 50C.
Problem is that I've been trying to get it to run at 3.6GHz default because some programs don't trigger the turbo boost and it runs slow as poo poo. After adjusting the bios to a higher minimum clock, it seems to switch between 1.2Ghz and 3.6 at a very high frequency according to Intel Power Gadget. Is this safe for the CPU? It switches frequencies several times per second and I'm worried it's damaging things. How do I get it to run at a constant bottom end speed?

This feature is totally normal and actually has a name, SpeedStep - it's been around a while, at least ten years on mobile processors. It allows the processor to save a lot of power when it's not loaded, and definitely won't cause damage since it's just underclocking and undervolting the processor on the fly. It should stay at full speed under load though.

Some motherboards have a feature to peg the processor at full speed (actually, Windows might too in advanced power options) but it doesn't really get you anything but a higher power bill unless your proc isn't staying at full speed when loaded for some reason.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Shouldn't it ramp the multipliers up and down instead of switching between just low and high performance states?

Anime Schoolgirl
Nov 28, 2002

~*perfect archangel*~


Eletriarnation posted:

Some motherboards have a feature to peg the processor at full speed (actually, Windows might too in advanced power options) but it doesn't really get you anything but a higher power bill unless your proc isn't staying at full speed when loaded for some reason.
My 4790T at a constant 3.3GHz runs at 5W idle, just one watt up from idling at the minimum clock, but a lot of little things (loading Photoshop, etc.) became much more responsive.

I'd say in a desktop it's very much worth the tradeoff, and it comes to a grand total of 56 cents a month per watt of difference in your power bill at California's power prices.

Also worth noting: if you do it through Windows, you have to copy the "high performance" power profile. There are a lot of invisible, registry-only flags governing load-based speedup that you'd otherwise inherit from the power saving/balanced plans.
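The per-watt cost figure depends entirely on your electricity rate; a back-of-envelope calculator (the rates below are assumptions, not a claim about any particular utility):

```python
# Monthly cost of one extra watt of continuous draw at a given rate.
HOURS_PER_MONTH = 730   # ~365.25 days * 24 h / 12 months

def monthly_cost_per_watt(rate_usd_per_kwh):
    """Dollars per month for 1 W of continuous draw at the given rate."""
    return 1 * HOURS_PER_MONTH / 1000 * rate_usd_per_kwh   # kWh * $/kWh

print(round(monthly_cost_per_watt(0.20), 2))   # 0.15 at an assumed $0.20/kWh
print(round(monthly_cost_per_watt(0.40), 2))   # 0.29 at an assumed top-tier rate
```

Either way, the monthly dollar figure per watt stays small next to the responsiveness win.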

EdEddnEddy
Apr 5, 2012



eggyolk posted:

I've been meaning to ask this for a while.
My workstation PC has a 5820K in it with a 240mm AIO liquid cooler. I built it myself and it runs great for Solidworks stuff. During rendering it clocks 4.4 at 50C.
Problem is that I've been trying to get it to run at 3.6GHz default because some programs don't trigger the turbo boost and it runs slow as poo poo. After adjusting the bios to a higher minimum clock, it seems to switch between 1.2Ghz and 3.6 at a very high frequency according to Intel Power Gadget. Is this safe for the CPU? It switches frequencies several times per second and I'm worried it's damaging things. How do I get it to run at a constant bottom end speed?

If you are running Windows as well, go to Power Options under Control Panel, Hardware and Sound; show the additional plans if you only see Balanced and Power Saver, and select High Performance. That one should keep the CPU near top turbo clock most of the time, even at idle, so if you are not doing anything with it you might want to switch back to Balanced.


Another thing I recently discovered is making a new power plan based on High Performance. If you do that and set the Minimum Processor State to 5% and the Max to 100%, it will still idle when it's doing nothing, like in Balanced, but it shoots up to near max turbo a lot quicker than Balanced does. Works great for my VR stuff and games that likewise don't pull the CPU up to speed, yet benefit a good bit when it is there. Really weird.

feedmegin
Jul 30, 2008



EdEddnEddy posted:

They didn't "loose" as their cards at the time were easily neck and neck with Nvidia.
They did however look appeasing to AMD which bought them out to get themselves a very good GPU business to help compete with Intel's Integrated GPU's for mobile devices.
Luckily they know that keeping the Dedicated GPU business alive is good for them too, and even better they are sort of spinning off the GPU wing again as a semi separate company lol. Should have just left them separate as ATI but well, Corps will be Corps.

Not to be that guy but you mean lose and appealing.

ATI was always held back by kinda crappy drivers even when the hardware was good though

EdEddnEddy
Apr 5, 2012



Yep. My English on Friday apparently took a nosedive as it was a rough week, and Saturday didn't help make the week any better.

I agree their drivers were always a mixed bag too. Usually you ended up staying with the one good one that worked with your card and all the games you wanted to play at the time. They have gotten better, but so has Nvidia, so it is a constant uphill battle for them.

At least the next gen of cards should bring some much needed competition once again, if Nvidia doesn't just curbstomp them with their new tech right out of the gate. They have both had some major time to put R&D into their new stuff, given how long they have been sitting on their current tech, with Fury being the only real newish tech in a long while.

El Scotch
Aug 25, 2009



https://www.youtube.com/watch?v=frNjT5R5XI4

2500K vs Skylake.

computer parts
Nov 18, 2010

PLEASE CLAP


The summary of this is basically "Definitely if you don't overclock, maybe if you do. Also memory bandwidth is an important factor. Also AMD can be pretty bad for DX11 games."

syntaxfunction
Oct 27, 2010


Interesting. My buddy has an i7-3960X overclocked to 4.2GHz (or around there). Would it be comparable to a stock Skylake i7 in most applications, or would Skylake just stomp all over it?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

computer parts posted:

The summary of this is basically "Definitely if you don't overclock, maybe if you do. Also memory bandwidth is an important factor. Also AMD can be pretty bad for DX11 games."

More like "Yeah, it's a little better, but for the price of a CPU + mobo + RAM you're looking at a 10-20% improvement at best, and sometimes near zero, depending on game and GPU." So basically wait for the next iteration unless you REALLY need those last few FPS for whatever reason, enjoy spending $500+ on incremental gains, and/or think that $500+ is worth the other bits that Skylake motherboards bring to the table.

Lovable Luciferian
Jul 10, 2007

Flashing my onyx masonic ring at 5 cent wing n trivia night at Dinglers Sports Bar - Ozma


Edit read the numbers wrong, ignore this.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast



They've just put up a piece analysing the 3770K, too.

Alchenar
Apr 9, 2008

The level of betrayal I felt when Paradox announced their new wallpaper tore something from me that I'll never be able to recover. They tore away my ability to respect anything, and they tore away my ability to feel human.


There's something ironically reassuring about the fact that Richard isn't a particularly slick or natural presenter - he's just a guy who really knows what he's talking about and carries the video with that.

NihilismNow
Aug 31, 2003


It seems like the increase in memory bandwidth only matters going from 1333-1600 to 2133, and not much above that. For example, I can't find any tests that suggest quad channel helps with framerates. So I wonder if the upgrade to 2133 would matter if you are on an older quad-channel platform with, say, 1600MHz RAM.
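For context, theoretical peak bandwidth scales linearly with channel count and transfer rate (8 bytes per channel per transfer), so an older quad-channel DDR3-1600 platform already has more raw bandwidth than dual-channel DDR4-2133:

```python
# Theoretical peak DRAM bandwidth: transfers/s * 8 bytes per channel * channels.
def peak_gb_s(mt_per_s, channels):
    """Peak bandwidth in decimal GB/s for a given transfer rate and channel count."""
    return round(mt_per_s * 8 * channels / 1000, 1)

print(peak_gb_s(1600, 4))   # 51.2 (quad-channel DDR3-1600)
print(peak_gb_s(2133, 2))   # 34.1 (dual-channel DDR4-2133)
```

Real-world copy benchmarks land well under these theoretical peaks, and games rarely saturate either configuration.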

Panty Saluter
Jan 17, 2004



Well, and here I thought 1600 was plenty. It probably was a couple of years ago, and I haven't bothered reading new articles until now.

El Scotch
Aug 25, 2009



Panty Saluter posted:

Well and here I thought 1600 was plenty. Probably was a couple of years ago, and I haven't bothered reading new articles until now

Surprised me too; I thought faster memory meant bupkis.

Panty Saluter
Jan 17, 2004



According to Intel my CPU only supports 1600 anyway, so that's a money saver.

Twerk from Home
Jan 17, 2009


Panty Saluter posted:

According to Intel my CPU only supports 1600 anyway, so that's a money saver.

If you have a -K chip on a Z-series chipset, you can overclock your RAM just like anything else. You can run DDR3 at however fast it's stable on a 2500K, which is likely around 2133-2400 with modern memory.

Panty Saluter
Jan 17, 2004



If my RAM is only rated for 1600 is it worth trying to bump it up? Or is that guaranteed no-post?

computer parts
Nov 18, 2010

PLEASE CLAP

Panty Saluter posted:

If my RAM is only rated for 1600 is it worth trying to bump it up? Or is that guaranteed no-post?

Ratings just mean "this is what we guarantee." It's quite likely you can bump it up with no issues, or you might be unlucky and get sticks that can't go much faster.

AVeryLargeRadish
Aug 19, 2011

WolfDad is Best Dad.


Panty Saluter posted:

If my RAM is only rated for 1600 is it worth trying to bump it up? Or is that guaranteed no-post?

Sure, you can try bumping it up; the worst that happens is that it won't boot at that speed or you get instability, and you can always turn the speed and timings back to default if you need to.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.


My old 1600MHz Kingston Blu would run at 2000 all day.

Panty Saluter
Jan 17, 2004



Sweet, time to get that sweet 1% performance increase (maybe)

PerrineClostermann
Dec 15, 2012


Let's build a great, great wall!


Alchenar posted:

There's something ironically reassuring about the fact that Richard isn't a particularly slick or natural presenter - he's just a guy who really knows what he's talking about and carries the video with that.

I just wish he could pronounce his "R"s.

Panty Saluter
Jan 17, 2004



PerrineClostermann posted:

I just wish he could pronounce his "R"s.

Wow, nice Amerocentric diction you troglodyte

PerrineClostermann
Dec 15, 2012


Let's build a great, great wall!


Panty Saluter posted:

Wow, nice Amerocentric diction you troglodyte

Twahglodyte

EdEddnEddy
Apr 5, 2012



syntaxfunction posted:

Interesting. My buddy has a i7-3960X overclocked to 4.2GHz (Or around there). Would it be comparable to a stock Skylake i7 in most applications or would Skylake just stomp all over it?

It depends more on the threading. I believe Skylake does beat the old SB-E in single-threaded stuff at this point (maybe by more at stock than overclocked, and with an X chip he should be able to get way more than just 4.2GHz), but in multi-threaded work he should come out ahead pretty well, since he has a full 6 cores vs 4.

Now, how quad-channel DDR3 compares to dual-channel DDR4 on Skylake, that's something I haven't looked at. Haswell-E has quad-channel DDR4 though, so it's not like Skylake has a definite advantage there either.
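The core-count vs. per-core-speed tradeoff can be framed with a crude throughput model. The 25% per-core advantage for Skylake below is an assumption for illustration, not a benchmark result:

```python
# Crude throughput model: well-threaded throughput ~ cores * per-core speed.
# The per-core speed factors are illustrative assumptions, not measurements.
def relative_throughput(cores, per_core_speed):
    """Relative multithreaded throughput for an ideally scaling workload."""
    return cores * per_core_speed

sandy_e = relative_throughput(6, 1.00)   # 3960X at 4.2 GHz as the baseline
skylake = relative_throughput(4, 1.25)   # assume ~25% faster per core (IPC + clocks)

# The six-core wins on well-threaded loads (6.0 vs 5.0) even while
# losing on any single thread.
print(sandy_e > skylake)   # True
```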

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Quad-channel DDR4 should have a definitive advantage over dual channel, no? That other video earlier showed faster RAM resulting in something like 10-ish percent more performance; surely doubling the number of channels should result in more?

--edit:
AIDA64 lists 42GB/s on a 5820K with quad-channel CL15 DDR4-2133 vs. 30GB/s on a 6700K with dual-channel CL14 DDR4-2133. Of course, this doesn't tell me if latency improved. Never mind, there's a latency section, and the 6700K has lower latency.

--edit2:
vvv Well, that guy in the video benchmarked framerates in video games and got higher ones with the faster RAM. How does that not count?

Combat Pretzel fucked around with this message at Feb 24, 2016 around 20:25


Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.


Combat Pretzel posted:

Quad channel DDR4 should have a definitive advantage over dual channel, no?

In benchmarks yes. In real world rarely.
