Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

MrYenko posted:

For me, moving from Nehalem to Skylake isn't even about performance; it's just to get out of my ancient X58-chipset motherboard, and even that isn't because of speed concerns, but because the thing is flat-out old. Dead USB ports, no functioning onboard network interface in years, and I really feel like it's currently the weak point of my machine.

Also, I like new stuff. :v:

Don't forget the fact that it's some horrible mix of PCIe 1.1 and 2.0a, and you have all of both slots available after the video card.


necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Yeah, PCI-e improvements are one thing that's accumulated over several years and is possibly worth upgrading for, and trying to avoid a lot of the PCI-e lane-splitting BS is another reason I opted for a server platform in 2011. Then there's SATA2 / SATA3 ports to consider, since newer SSDs are bottlenecked on SATA2 (and guess who bought a top-of-the-line SATA SSD without thinking of this? Me, of course). M.2 slots are going to eclipse all of these as well, so at least I'll be able to use those super fast SSDs coming out later this year... when they're on sale, of course.

Another thing about server motherboards is that most don't have onboard sound (rightfully so). I felt really silly when I wound up buying a sound card in 2014 because I simply didn't have one and suddenly realized I needed one. Putting my Xeon into service as a dedicated server is feeling like the right call at this point.
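For anyone wondering how hard SATA2 actually caps a modern SSD, the math is just link rate minus encoding overhead; a quick sketch (spec'd line rates, nothing measured):

```python
# SATA uses 8b/10b encoding: 10 bits travel on the wire per data byte.
def sata_usable_mb_per_s(line_rate_gbps: float) -> float:
    """Usable payload bandwidth for a given SATA line rate."""
    return line_rate_gbps * 1e9 / 10 / 1e6  # bits/s -> bytes/s -> MB/s

print(f"SATA2 (3 Gb/s): ~{sata_usable_mb_per_s(3):.0f} MB/s")  # ~300 MB/s
print(f"SATA3 (6 Gb/s): ~{sata_usable_mb_per_s(6):.0f} MB/s")  # ~600 MB/s
```

A top-end SATA SSD pushes well past 300 MB/s sequential, so on SATA2 you're leaving roughly half its bandwidth on the table.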

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

necrobobsledder posted:

Yeah, PCI-e improvements are one thing that's accumulated over several years and is possibly worth upgrading for, and trying to avoid a lot of the PCI-e lane-splitting BS is another reason I opted for a server platform in 2011. Then there's SATA2 / SATA3 ports to consider, since newer SSDs are bottlenecked on SATA2 (and guess who bought a top-of-the-line SATA SSD without thinking of this? Me, of course). M.2 slots are going to eclipse all of these as well, so at least I'll be able to use those super fast SSDs coming out later this year... when they're on sale, of course.

Another thing about server motherboards is that most don't have onboard sound (rightfully so). I felt really silly when I wound up buying a sound card in 2014 because I simply didn't have one and suddenly realized I needed one. Putting my Xeon into service as a dedicated server is feeling like the right call at this point.

I have yet to find onboard sound that does not add distortion. I currently use an awesome Asus card that cost me like $20 and has an S/PDIF out.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

mayodreams posted:

I have yet to find onboard sound that does not add distortion. I currently use an awesome Asus card that cost me like $20 and has an S/PDIF out.

The little dedicated card that fits onto my Maximus VII Gene sounds pretty good playing through my Marantz.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

mayodreams posted:

I have yet to find onboard sound that does not add distortion. I currently use an awesome Asus card that cost me like $20 and has an S/PDIF out.
Well, I'm on a mini-ITX board, which leaves me screwed for PCI-e slots, so I'd have to run Toslink to an external mixer and amp. Or... I can just not care, use USB speakers with onboard amps, and move on. I'm still on the hunt for a solid-value mixer and amp that can take USB audio while giving me some options for my analog headphones and mic (I bought a ModMic for flexibility and because it's a directional mic; it has successfully blocked out my cat, who randomly screams while I'm on conference calls).

I'm not sure what kind of noise I'll get when using Toslink for audio out, but I get the impression the processing will all happen onboard and still pick up all the board interference noise (and especially jitter) when converting to the optical Toslink signal. I'm not exactly doing audio recording, but it's really annoying hearing random noises when I start a high-CPU task. I used to get hard-drive whining noises a decade ago on an older Cirrus Logic chipset card from Turtle Beach (all that trouble to avoid the Linux software mixer), noises I didn't seem to get on Windows, so there may be a software component involved that I'm not aware of (aside from the ASIO stuff, that is).

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

atomicthumbs posted:

The little dedicated card that fits onto my Maximus VII Gene sounds pretty good playing through my Marantz.

I've been curious about that. I'm still on a Z68 board, but I'm sure things have improved some. I am sending S/PDIF to some Edirol studio monitors, which are just amazing.

I've had issues in the past with USB DACs getting bogged down under high CPU or audio load, like playing a simple game while having music on too.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.
How do I get Quick Sync Video to work on my 4790K? I've made sure the onboard GPU is enabled and I installed the Intel drivers, but nothing that uses QSV seems to work.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
You dinguses talking motherboard audio should go to the audiophile ridicule thread. S/PDIF is a digital connector. There's no loving "jitter"; all the processing goes through the Windows software sound mixer and comes out bit-identical whether it's onboard S/PDIF, split off HDMI, a USB-to-S/PDIF audio interface, or a $500 internal sound card's S/PDIF output. Digital. The output is a 1 or a 0. It can't jitter. The digital interfaces on the card completely bypass the DAC and every component that could produce noise of any variety.

The only thing that the sound card and driver can do is up-mixing from stereo to surround (or 5.1 to 7.1 etc.) for content that isn't natively multi-channel.
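If you'd rather verify the bit-perfect claim than take my word for it, here's a minimal sketch; the file names are hypothetical captures of the same source taken over two different S/PDIF paths at the same sample rate and bit depth:

```python
# Compare two captures of the same digital stream for bit-exactness.
# capture_onboard.wav / capture_usb_interface.wav are made-up names for
# recordings of one source taken over two different S/PDIF paths.
import wave

def pcm_frames(path: str) -> bytes:
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

a = pcm_frames("capture_onboard.wav")
b = pcm_frames("capture_usb_interface.wav")
print("bit-perfect match" if a == b else "streams differ")
```

In practice you'd have to trim both captures so they start on the same sample before comparing, but the point stands: if the bits match, the transport added nothing.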

atomicthumbs posted:

How do I get Quick Sync Video to work on my 4790K? I've made sure the onboard GPU is enabled and I installed the Intel drivers, but nothing that uses QSV seems to work.

In the BIOS/UEFI setup, is iGPU Multi-Monitor set to "Enabled"? If it already is and it's still not working, try hooking up a monitor cable (e.g., run a DVI cable to another input on your screen). You don't need to actually put an image on it via the Windows multi-monitor settings, but a connected display used to be a requirement for enabling QuickSync, so it's worth trying as a troubleshooting step.

Alternatively, if you have an Nvidia video card, just use NVENC - it's pretty good, too.
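If you want to check whether QSV works at all, independent of any one app's support for it, here's a minimal smoke test; it assumes an ffmpeg build with QSV (libmfx) support on your PATH, which is an assumption, not a given:

```python
# Smoke-test Intel QuickSync: encode one second of a synthetic test
# pattern with ffmpeg's h264_qsv encoder and throw the output away.
# A non-zero exit code means QSV isn't reachable from this setup.
import subprocess

cmd = [
    "ffmpeg", "-hide_banner", "-loglevel", "error",
    "-f", "lavfi", "-i", "testsrc2=duration=1:size=1280x720:rate=30",
    "-c:v", "h264_qsv",
    "-f", "null", "-",
]
ok = subprocess.run(cmd).returncode == 0
print("QSV encode worked" if ok else "QSV encode failed")
```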

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

necrobobsledder posted:

I'm not sure what kind of noise I'll get when using Toslink for audio out, but I get the impression the processing will all happen onboard and still pick up all the board interference noise (and especially jitter) when converting to the optical Toslink signal. I'm not exactly doing audio recording, but it's really annoying hearing random noises when I start a high-CPU task.

What F^2 said: jitter is a real phenomenon, but it isn't significant for digital audio in 99% of cases.

But really I'm posting because those random noises you hear when starting a high-CPU task may not be what you think. CPUs get their power from switch-mode regulators. Switchers have magnetic coil inductors, which can mechanically flex a teeny little bit as the electric current flowing through them changes. And that current is changing: modern CPUs tend to rapidly switch between lots of current and almost none when they're not fully loaded, and these switching events can sometimes happen at audible frequencies...

In other words, it is sometimes possible to hear noises emanating from components on your motherboard rather than your speakers, if the room is quiet and nothing else is masking the sound.

is that good
Apr 14, 2012
My fairly cheap DAC gets crackling sounds through its Toslink input, but it has a USB input option which works fine. Granted, I'm on a super cheapo ASRock board from a while ago. The processing happens on your DAC either way, and its output to your amp is line-level regardless. (Some DAC manufacturers specifically call out that some Toslink transmitters don't work too well at 24/192, which is probably what you're shooting for, but you should probably be fine.)

LRADIKAL
Jun 10, 2001

Fun Shoe

BobHoward posted:

What F^2 said: jitter is a real phenomenon, but it isn't significant for digital audio in 99% of cases.

But really I'm posting because those random noises you hear when starting a high-CPU task may not be what you think. CPUs get their power from switch-mode regulators. Switchers have magnetic coil inductors, which can mechanically flex a teeny little bit as the electric current flowing through them changes. And that current is changing: modern CPUs tend to rapidly switch between lots of current and almost none when they're not fully loaded, and these switching events can sometimes happen at audible frequencies...

In other words, it is sometimes possible to hear noises emanating from components on your motherboard rather than your speakers, if the room is quiet and nothing else is masking the sound.

https://www.youtube.com/watch?v=HP73edpQwgc
Certain nVidia 970s were doing that pretty badly.

edit: oh yeah, lol'ing at the jitter worry.

GRINDCORE MEGGIDO
Feb 28, 1985


Dammit, looks like my Z77 board is getting flaky and not detecting graphics cards 100% of the time :(

Normally I'd just replace the old board and chip with something newer and sell the chip, but I'm kinda attached to this 2500K and not seeing much benefit in going Haswell.
This might be the first time I ever replace an old motherboard with something of similar spec.

Darkpriest667
Feb 2, 2015

I'm sorry I impugned
your cocksmanship.

wipeout posted:

Dammit, looks like my Z77 board is getting flaky and not detecting graphics cards 100% of the time :(

Normally I'd just replace the old board and chip with something newer and sell the chip, but I'm kinda attached to this 2500K and not seeing much benefit in going Haswell.
This might be the first time I ever replace an old motherboard with something of similar spec.


What model do you have? I'm still on a P67 board that's been running strong for 4 years now!

GRINDCORE MEGGIDO
Feb 28, 1985


Darkpriest667 posted:

What model do you have? I'm still on a P67 board that's been running strong for 4 years now!

This board is an ASRock Z77E-ITX - it just intermittently decides there is nothing in the PCI-e slot. Cleared the CMOS, tried other graphics cards (one nVidia, one AMD), reseated them, updated the BIOS and tried betas. Same deal :(

(Bugger is, I can find a lot of cheapish ATX S1155 boards, but most of the mITX ones are overpriced now. I was hoping to keep this alive until there was something really compelling to upgrade to.)

SamDabbers
May 26, 2003



wipeout posted:

(Bugger is, I can find a lot of cheapish ATX S1155 boards, but most of the mITX ones are overpriced now. I was hoping to keep this alive until there was something really compelling to upgrade to.)

Amazon has brand new ASUS P8Z77-I DELUXE/WDs for $143 after a $50 rebate. Seems like a solid deal all things considered.

GRINDCORE MEGGIDO
Feb 28, 1985


SamDabbers posted:

Amazon has brand new ASUS P8Z77-I DELUXE/WDs for $143 after a $50 rebate. Seems like a solid deal all things considered.

That's pretty awesome and I appreciate it. I'll have to see what I can find in the UK, though.

GRINDCORE MEGGIDO fucked around with this message at 20:02 on Feb 13, 2015

Welmu
Oct 9, 2007
Metri. Piiri. Sekunti.

BobHoward posted:

In other words sometimes it is possible to hear noises emanating from components on your motherboard rather than your speakers, if the room's quiet and there's nothing else masking the sound.

It's also possible to analyze this noise and use it to extract RSA keys (Genkin, Shamir, and Tromer demonstrated this kind of acoustic cryptanalysis against GnuPG).


In other news, Skylake Core-M is coming later this year, according to Krzanich.

Haswell-EX specs:

SCheeseman
Apr 23, 2003

wipeout posted:

Dammit, looks like my Z77 board is getting flaky, and not detecting graphics cards 100% of the time :(

Normally I'd just replace the old board and chip with something newer and sell the chip, but kinda attached to this 2500K and not seeing much benefit to going Haswell.
This might be the first time I ever replace an old motherboard with something of similar spec.

Get a can of compressed air or something and blow out the connector slot. I had a similar issue with my board and it turned out to be dust in the PCIe slot.

BurritoJustice
Oct 9, 2012

Welmu posted:

It's also possible to analyze this noise and use it to extract RSA keys (Genkin, Shamir, and Tromer demonstrated this kind of acoustic cryptanalysis against GnuPG).


In other news, Skylake Core-M is coming later this year, according to Krzanich.

Haswell-EX specs:


Can anyone explain to me why the "top" (at least model-number-wise) part is a 4-core 140W part? Seems a bit insane; the frequency difference relative to the 18-core part is fairly minor.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

BurritoJustice posted:

Can anyone explain to me why the "top" (at least model-number-wise) part is a 4-core 140W part? Seems a bit insane; the frequency difference relative to the 18-core part is fairly minor.

A 22% increase in clock speed is not a minor difference. Some workloads are not heavily threaded, so clock speed becomes king, and a 140W-TDP 4-core proc will get you better results there than an 18-core with 25W more TDP that throttles cores to stay under its TDP cap.
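The tradeoff is easy to see with a toy model; to be clear, the clocks and core counts below are illustrative stand-ins, not the actual SKUs' numbers:

```python
# Toy model: effective throughput ~ clock * cores the workload keeps
# busy. Clocks and core counts here are illustrative only.
def effective_ghz(clock_ghz: float, cores: int, busy_threads: int) -> float:
    return clock_ghz * min(cores, busy_threads)

for threads in (1, 4, 18):
    quad = effective_ghz(3.2, 4, threads)    # hypothetical high-clock 4-core
    many = effective_ghz(2.5, 18, threads)   # hypothetical 18-core
    print(f"{threads:2d} busy threads: 4-core {quad:5.1f} vs 18-core {many:5.1f}")
```

With one or four busy threads the high-clock quad wins outright; the 18-core only pulls ahead once the workload actually scales past four threads.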

sauer kraut
Oct 2, 2004
Welp RIP Broadwell desktop according to the rumor mill.
Your little 14nm lanes were burning too bright for this world.

GRINDCORE MEGGIDO
Feb 28, 1985


SwissCM posted:

Get a can of compressed air or something and blow out the connector slot. I had a similar issue with my board and it turned out to be dust in the PCIe slot.

Air duster ordered, you rock!

Darkpriest667
Feb 2, 2015

I'm sorry I impugned
your cocksmanship.

wipeout posted:

This board is an ASRock Z77E-ITX - it just intermittently decides there is nothing in the PCI-e slot. Cleared the CMOS, tried other graphics cards (one nVidia, one AMD), reseated them, updated the BIOS and tried betas. Same deal :(

(Bugger is, I can find a lot of cheapish ATX S1155 boards, but most of the mITX ones are overpriced now. I was hoping to keep this alive until there was something really compelling to upgrade to.)

Bummer. I have a Fatal1ty Pro board, also by ASRock. I swear by the Fatal1ty Professional series; I've owned 3 of them with different chipsets and they have always been real good to me. I like ASRock as an alternative to ASUS and Gigabyte.

Rime
Nov 2, 2011

by Games Forum

sauer kraut posted:

Welp RIP Broadwell desktop according to the rumor mill.
Your little 14nm lanes were burning too bright for this world.

:negative:

BurritoJustice
Oct 9, 2012

mayodreams posted:

A 22% increase in clock speed is not a minor difference. Some workloads are not heavily threaded, so clock speed becomes king, and a 140W-TDP 4-core proc will get you better results there than an 18-core with 25W more TDP that throttles cores to stay under its TDP cap.

But Intel has 4.4GHz 4-cores in 88W TDPs, and the only differences I can really see are the extra cache and the QPI links, which can't make up the gap. Looking at the clock speeds, though, I should've compared it to the 10-core: 6 more cores and 12.5% slower clocks for only a 25W delta?

doomisland
Oct 5, 2004

mayodreams posted:

A 22% increase in clock speed is not a minor difference. Some workloads are not heavily threaded, so clock speed becomes king, and a 140W-TDP 4-core proc will get you better results there than an 18-core with 25W more TDP that throttles cores to stay under its TDP cap.

Also you can put 8 of them in a system :D

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

BurritoJustice posted:

But Intel has 4.4GHz 4-cores in 88W TDPs, and the only differences I can really see are the extra cache and the QPI links, which can't make up the gap. Looking at the clock speeds, though, I should've compared it to the 10-core: 6 more cores and 12.5% slower clocks for only a 25W delta?

Those might be partially defective chips that are salvageable by pumping massive amounts of power through them. Not ideal but better than simply throwing them away if their yields are struggling.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

isndl posted:

Those might be partially defective chips that are salvageable by pumping massive amounts of power through them. Not ideal but better than simply throwing them away if their yields are struggling.

Depending on how they fuse the chips, it could also be a frequency/voltage/TDP tradeoff, where marginal chips with a really leaky set of cores get those cores fused off, and the TDP freed up goes to bumping the frequency/voltage on the rest to compensate. 10 cores with a 160W TDP and a flaky subset of cores turns into, effectively, a 4670K with ECC RAM.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

BurritoJustice posted:

Can anyone explain to me why the "top" (at least model-number-wise) part is a 4-core 140W part? Seems a bit insane; the frequency difference relative to the 18-core part is fairly minor.

That part exists for software that is licensed per-core but needs the larger memory capacity of the EX platform.
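A toy illustration of that licensing math; the per-core price here is completely made up, but the shape of the argument holds:

```python
# Per-core licensed software makes a low-core, big-memory part cheap to
# run. $10k/core is a made-up figure, not any vendor's actual list price.
PER_CORE_LICENSE = 10_000  # hypothetical $/core

for cores in (4, 18):
    print(f"{cores:2d} cores -> ${cores * PER_CORE_LICENSE:,} in licenses")
```

Same EX memory ceiling either way, so if the database is memory-bound rather than core-bound, the 4-core part saves six figures even at this made-up price.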

BurritoJustice posted:

But Intel has 4.4GHz 4-cores in 88W TDPs, and the only differences I can really see are the extra cache and the QPI links, which can't make up the gap. Looking at the clock speeds, though, I should've compared it to the 10-core: 6 more cores and 12.5% slower clocks for only a 25W delta?

There are a whole lot more transistors on the E7, plus base vs. turbo clocks, more stringent validation, and market positioning. This should clear things up:

Here's a picture of a Haswell i7:


Here's a picture of the active* components of a 4-core, 40 MB E7:


Note that the cores are only a fraction of the overall die.

I am somewhat surprised that the 8883v3 is slower than the 8883v2, though.

* This is an Ivy Bridge EX, so the DDR4 controllers will (probably) be larger on Haswell-EX. Same 22 nm process, though. Some portion of the blanked-out areas would still be active.

GRINDCORE MEGGIDO
Feb 28, 1985


Darkpriest667 posted:

Bummer. I have a Fatal1ty Pro board, also by ASRock. I swear by the Fatal1ty Professional series; I've owned 3 of them with different chipsets and they have always been real good to me. I like ASRock as an alternative to ASUS and Gigabyte.

Ya, this little ITX board has been awesome; fingers crossed it's just some crud in the slot. I'd buy ASRock again, no question.

Hoping socketed Broadwell chips are actually going to appear - aren't they the last procs Z97 will support?

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Methylethylaldehyde posted:

Depending on how they fuse the chips, it could also be a frequency/voltage/TDP tradeoff, where marginal chips with a really leaky set of cores get fused off and they just allocate the TDP on the other half to bumping the frequency/voltage up to compensate. 10 Cores with a 160W TDP due to a flaky subset of cores turns into a 4670K with ECC ram.

Although fusing is a thing, there are very different dies involved here (see PCjr's pic), and they don't ever turn one of these big monster parts into a 4670K.

Intel's current pattern is to do (at least) three groups of designs (ignoring everything for mobile):

Xeon E7
Xeon E5 / LGA20xx enthusiast desktop
Xeon E3 / LGA11xx desktop

There is no crossover based on fusing between the three groups because they are very different designs. And the E7 is really different: it's targeted at very large, very-high-reliability servers, where cost is not a problem and exotic features are common. For example, the E7 doesn't have standard DDR memory interfaces on-chip; instead it speaks a different protocol to arrays of external buffer/controller chips, allowing it to support truly fuckoff quantities of memory. (Probably multiple terabytes for the new E7 v3 family. Looks like E7 v2 already supported up to 1.5TB.)

Intel doesn't have a great need to play fusing games in order to recover poorly performing or partially defective E7 chips. The prices are so high ($5K or more per chip is not out of the question) that they can easily afford low yield.

E7 is also held to much more stringent reliability requirements. They're very conservative with binning. That, plus the extra power used by RAS and scalability features, is why you end up with much lower top-bin frequencies than the non-behemoth product lines.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

sauer kraut posted:

Welp RIP Broadwell desktop according to the rumor mill.
Your little 14nm lanes were burning too bright for this world.
Good riddance, I guess? It never made sense to release Broadwell when Skylake was right on its heels.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

BobHoward posted:

E7 is also held to much more stringent reliability requirements. They're very conservative with binning. That, plus the extra power used by RAS and scalability features, is why you end up with much lower top-bin frequencies than the non-behemoth product lines.

Yeah, I probably picked a lovely part to compare it to; it was the first 'stupid fast, 4 cores' thing I could think of. It's still probably TDP budgeting: disable the leaky cores and rebin it as a different SKU. That improves yields and allows a sort of natural market segmentation without needing to produce more than one set of silicon masks.

They can certainly afford to be super picky about which parts end up in the $5K CPU bin, but even partial recovery on a high-cost item is better than no recovery.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Looks like there's enough demand for those 2TB+ RAM graph-processing nodes that LinkedIn and some others wanted from OEMs, so Intel decided to chip in with that kind of memory hardware. Neat.

Fajita Fiesta
Dec 15, 2013
Can anyone explain why Twitch streamers keep buying Xeon CPUs for x264 streaming encodes when consumer-grade CPUs murder them in performance? http://www.anandtech.com/bench/CPU/53

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
If you look at the x264 two-pass results, the Xeons murder the desktop CPUs.

Fajita Fiesta
Dec 15, 2013
I thought live-streaming x264 was strictly one-pass; two-pass needs the whole input up front.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I thought the point of the Xeons was that you can get more than four cores, so you can encode and stream multiple things in parallel, or stream and play on the same box without much of an impact.

Nintendo Kid
Aug 4, 2011

by Smythe
Honestly I thought they were mostly buying Xeons because it lets them set higher asking points for donations.


in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

BobHoward posted:

(Probably multiple terabytes for the new E7 v3 family. Looks like E7 v2 already supported up to 1.5TB.)

1.5 TB per socket :)

necrobobsledder posted:

Looks like there's enough demand for those 2TB+ RAM graph-processing nodes that LinkedIn and some others wanted from OEMs, so Intel decided to chip in with that kind of memory hardware. Neat.

You could get a 4-socket, 2TB node from IBM in 2011 on the E7 v1s. You see these in a lot of large database boxes; Intel has made a lot of money off of Oracle for a while now. They'll even do stuff like provide Oracle-exclusive SKUs; $300K (list) will get you a 120-core, 6TB system from Lenovo.

Fajita Fiesta posted:

Can anyone explain why Twitch streamers keep buying Xeon CPUs for x264 streaming encodes when consumer-grade CPUs murder them in performance? http://www.anandtech.com/bench/CPU/53

I don't know which Xeons they're buying, but a WAG would be that encoding and gaming at the same time benefits from the extra cores over the consumer i7s in a way that isn't reflected in a synthetic encode-only benchmark.
