cliffy
Apr 12, 2002

Anyone ever had a CPU go bad? My 5820k started causing weird full-system hardlocks and crashes. Took me a while to narrow it down, but after swapping it out a couple of times it's pretty clear the processor was the culprit.

Apparently something to watch out for on 5820k processors at least. I've found reviews on Newegg mentioning similar troubles. Worked for about a year before starting to die. Had adequate cooling and no crazy overclocking.

Col.Kiwi
Dec 28, 2004
And the grave digger puts on the forceps...
Yes, I have seen it a few times, where we confirmed that replacing the CPU resolved the issue and going back to the bad CPU brought it back. It's vastly less common than motherboard failure or other failures, which is why nobody ever expects it, but it happens.

redeyes
Sep 14, 2002

by Fluffdaddy
I haven't personally seen this, but I read that in these cases the CPU pads might be corroded. You might be able to clean the pads with some electrical contact cleaner. It makes some sense given that the VRMs are located inside the CPU; in the Skylake generation they went back to mobo VRMs.

EdEddnEddy
Apr 5, 2012



A 5820K should still be under warranty shouldn't it?

Intel is pretty good about replacing it as long as you can detail the troubleshooting steps pretty clearly and weren't just abusing the hell out of it at 1.5v+ or something.

I had a similar issue with an i7 920 back in the day, and even at stock speeds it was crashing similar to the way you mentioned. Got on a chat with Intel and they swapped it out for a brand new i7 920 (the best revision of it at the time, too). That chip has been kicking like a champ ever since.

cliffy
Apr 12, 2002

EdEddnEddy posted:

A 5820K should still be under warranty shouldn't it?

Intel is pretty good about replacing it as long as you can detail the troubleshooting steps pretty clearly and weren't just abusing the hell out of it at 1.5v+ or something.

I had a similar issue with an i7 920 back in the day, and even at stock speeds it was crashing similar to the way you mentioned. Got on a chat with Intel and they swapped it out for a brand new i7 920 (the best revision of it at the time, too). That chip has been kicking like a champ ever since.

Yes, it's under warranty and I've already received a replacement from Intel. Just really surprising to have a CPU go bad is all.

EdEddnEddy
Apr 5, 2012



Yea it really isn't very common these days, but it can happen.

I thought I'd run into that on the i3-2100 system I won at an Nvidia LAN, but it turned out to be the Patriot DDR3 memory in the system, since the issue followed me over to a replacement motherboard/CPU build. (The errors I got kept pointing at CPU 1 and stuff, but Memtest always came back fine.)

Patriot recently came through on that lifetime warranty too. I sent them the bad 4GB (2x2GB) kit, and they sent me back a same-speed 8GB (2x4GB) kit. :)

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

apropos man posted:

I didn't realise that Linux being good at multi-threading was a thing. I guess it depends on what packages you're using to interface with the kernel? In that case I look forward to having a go with full disk encryption. Quite excited about this £100 upgrade :)

PBCrunch posted:

I don't know if it's accurate to say Linux is good at taking advantage of threads; the types of tasks people accomplish using Linux tend to lend themselves to multiple threads.

A Linux machine is far more likely to serve files or run a database than to run an office suite or play a game that requires maxed-out single-thread performance.

What I was getting at is more that the Unix philosophy of "do one thing well" tends to lend itself very nicely to threading. Instead of monolithic applications you tend to get applications that consist of multiple processes strung together, both at an application level (eg via pipes) and an OS level. You still can't defeat a true I/O bottleneck or Amdahl's law, but monolithic applications can often make things sequential that don't have to be and by separating them out you can get more cores into effective use.

It's certainly not true of everything of course, but one classic example is Adam Drake's article Command-line tools can be 235x faster than your Hadoop cluster.
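
To make that concrete, here's a rough Python sketch of both points (the "access.log" file name and the 75%-parallel figure are made up, purely for illustration, and it assumes a Unix-like system): a few small processes strung together with pipes that the kernel can schedule on separate cores, plus the Amdahl's law ceiling you still can't beat.

code:

import subprocess

# Three separate processes connected by pipes; the kernel is free to run
# them on different cores at the same time.
gen = subprocess.Popen(["cat", "access.log"], stdout=subprocess.PIPE)
filt = subprocess.Popen(["grep", "ERROR"], stdin=gen.stdout, stdout=subprocess.PIPE)
count = subprocess.Popen(["wc", "-l"], stdin=filt.stdout, stdout=subprocess.PIPE)
gen.stdout.close()   # let cat see SIGPIPE if grep exits early
filt.stdout.close()
print("error lines:", count.communicate()[0].decode().strip())

# Amdahl's law: the serial fraction caps the speedup no matter how many
# cores you throw at it.
def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

print("75% parallel, 6 cores:", round(amdahl_speedup(0.75, 6), 2), "x")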

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

redeyes posted:

I haven't personally seen this, but I read that in these cases the CPU pads might be corroded. You might be able to clean the pads with some electrical contact cleaner. It makes some sense given that the VRMs are located inside the CPU; in the Skylake generation they went back to mobo VRMs.

I have to know: what on earth does the placement of VRMs have to do with corrosion? Also, if the lands are corroded you have been doing some serious hazmat poo poo to your computer, because that poo poo is gold-plated.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Paul MaudDib posted:

What I was getting at is more that the Unix philosophy of "do one thing well" tends to lend itself very nicely to threading. Instead of monolithic applications you tend to get applications that consist of multiple processes strung together, both at an application level (eg via pipes) and an OS level.

Oh hell no

I would love to know of any actual examples of professionally written compute intensive Unix applications which are broken down into multiple tiny do-one-thing-well processes strung together with pipes, because I would find that amazing. It is a painful model for writing anything serious and it is not efficient either (piping poo poo around isn't free). The main reason "do one thing well" is popular is that it's a great tool for rapidly throwing together programs when you don't care much about maintainability or performance. The downside is that it's a great tool for rapidly etc. (because the rapidity tempts people into going to that well too often, and soon you have a lovecraftian nightmare of shell script that has to be maintained).

quote:

You still can't defeat a true I/O bottleneck or Amdahl's law, but monolithic applications can often make things sequential that don't have to be and by separating them out you can get more cores into effective use.

It's certainly not true of everything of course, but one classic example is Adam Drake's article Command-line tools can be 235x faster than your Hadoop cluster.

That is not the example you think it is. Hadoop's entire reason to exist is massive parallelism... at the cluster level. Using it for the described job size is like firing up one of those monstrous 40 foot tall mining ore haulers to take a 1 mile trip to the post office instead of riding your bike. Bike dude gets back and has a frosty drink in hand before the guy using the ore hauler has completed its engine start procedure. Drake is reminding people that overhead matters.

(also wouldn't be surprised if the Hadoop newbie Drake linked to made horrible optimization mistakes. Drake massively improved the performance of his own hack with a few simple tweaks, and that's the kind of thing you don't know to do when you're new to a technology.)

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Well this is interesting: https://www.techpowerup.com/226306/msi-announces-bios-updates-for-kaby-lake-7th-gen-core-processors

Nice to know the Kaby Lakes don't need a 200-series board.

Pryor on Fire
May 14, 2013

they don't know all alien abduction experiences can be explained by people thinking saving private ryan was a documentary

Yeah, it's kind of an open secret that multithreading is (still) difficult, and even people and systems that are really really good at it can't scale out well, if at all, and they're often much more error prone. I think this is completely independent of the unix do-one-thing-and-spit-text-both-ways philosophy. I mean, I guess unix is this confusing clusterfuck of 7561523 tools and processes that gets parallelized by the OS to some degree, but nobody really cares about OS performance anymore.

I'd imagine if machine learning takes off in a big way then we'll see some insane benefits to increased threads/core capacity that we can't even understand. Maybe that's actually the answer: instead of some massive complicated shitshow like threading in Java, we can all just twiddle our thumbs until the machines figure it out for us.

Anime Schoolgirl
Nov 28, 2002

BIG HEADLINE posted:

Well this is interesting: https://www.techpowerup.com/226306/msi-announces-bios-updates-for-kaby-lake-7th-gen-core-processors

Nice to know the Kaby Lakes don't need a 200-series board.
They're bumping the generation number up by one when it's really just Devil's Canyon all over again

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Kaby Lake promises less CPU strain when playing 4K video. Is this something that would be delegated to a modern dedicated graphics card (like Nvidia's 10XX series) if you have one? Or is this a useful feature both for people on built-in graphics and for users with dedicated graphics cards?

e: \/ Thank you, thank you.

lllllllllllllllllll fucked around with this message at 22:01 on Oct 2, 2016

apropos man
Sep 5, 2016

You get a hundred and forty one thousand years and you're out in eight!

lllllllllllllllllll posted:

Kaby Lake promises less CPU strain when playing 4K video. Is this something that would be delegated to a modern dedicated graphics card (like Nvidia's 10XX series) if you have one? Or is this a useful feature both for people on built-in graphics and for users with dedicated graphics cards?

Possibly whatever features they've pipelined into the chips to deal with high-bitrate video will have great implications for mobile computing. Even if a laptop only has a 1080p screen, if 4K becomes standard for downloads and streaming then people are gonna buy a CPU that can handle it, even if it's being downscaled most of the time. Not forgetting people using a laptop plugged into a secondary display as a presentation device.

I guess it's a case of improving the section of the CPU that is utilised for video data streams.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

lllllllllllllllllll posted:

Kaby Lake promises less CPU strain when playing 4K video. Is this something that would be delegated to a modern dedicated graphics card (like Nvidia's 10XX series) if you have one? Or is this a useful feature both for people on built-in graphics and for users with dedicated graphics cards?

Although they advertise it as being for 4K video, what's going on is that they're adding support for a newer standard codec, which would still benefit decoding and encoding in that format at 1080p or even 320x240. Most video cards out right now won't happen to support it but it'll eventually be standard there.

The previous generation of CPUs supported the new HEVC codec, but only really well at 1080p resolution and 8 bits per channel of color depth. Kaby Lake has better HEVC support that remains useful at 4K resolution and 10 bits per color channel, AND adds hardware decoding for VP9 format video as well.
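
If you're curious what your own box exposes, here's a hedged little Python sketch that just shells out to ffmpeg, assuming you have it installed; decoder names like hevc_qsv or vp9_qsv vary by build and platform, so treat it as a probe rather than proof that the fixed-function path is actually used.

code:

import subprocess

def ffmpeg_list(flag):
    # e.g. "ffmpeg -hide_banner -hwaccels" or "ffmpeg -hide_banner -decoders"
    result = subprocess.run(["ffmpeg", "-hide_banner", flag],
                            capture_output=True, text=True, check=False)
    return result.stdout.splitlines()

# hardware acceleration methods this ffmpeg build knows about (qsv, vaapi, dxva2, ...)
print("\n".join(ffmpeg_list("-hwaccels")))

# decoders mentioning hevc or vp9 (e.g. hevc_qsv / vp9_qsv for Intel Quick Sync)
print("\n".join(line for line in ffmpeg_list("-decoders")
                if "hevc" in line or "vp9" in line))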

Fuzzy Mammal
Aug 15, 2001

Lipstick Apathy
Is there anyone who is serious about the quality of their output, which you presumably are if you're making 4K, who is satisfied with GPU or Intel hardware offload? Also I can't think of anyone in the world who would be pushed over the edge on a buying decision by a feature like this. It's just not something you hang a generation of products on.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I figure most of the hardware accelerated encode / decode is primarily oriented around streaming video use cases such as video teleconferencing and live streaming video games.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Fuzzy Mammal posted:

It's just not something you hang a generation of products on.

In a cynical world where Intel knows that a lot of battery test suites hinge heavily on things like "played a looped YouTube video until the laptop dies," something like this may give noticeable battery life boosts, and saying you're adding 25%+ to the "useful life" of a charge is absolutely something you can hang a mobile CPU generation on. On paper, anyhow.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

DrDork posted:

In a cynical world where Intel knows that a lot of battery test suites hinge heavily on things like "played a looped YouTube video until the laptop dies," something like this may give noticeable battery life boosts, and saying you're adding 25%+ to the "useful life" of a charge is absolutely something you can hang a mobile CPU generation on. On paper, anyhow.

plus it's gonna reuse a bunch of the functional blocks for AVC/HEVC anyway, so the marginal silicon added is pretty low

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Fuzzy Mammal posted:

Is there anyone who is serious about the quality of their output, which you presumably are if you're making 4K, who is satisfied with GPU or Intel hardware offload? Also I can't think of anyone in the world who would be pushed over the edge on a buying decision by a feature like this. It's just not something you hang a generation of products on.

Yeah, this is all about battery life. Dropping CPU power usage watching HEVC video from ~12W to 1W is huge.
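
Rough back-of-the-envelope sketch of why that matters for a laptop; everything except the ~12W/1W figures above is a made-up but plausible number.

code:

# hypothetical laptop: 45 Wh battery, ~5 W baseline for screen + rest of platform
BATTERY_WH = 45.0
PLATFORM_W = 5.0
SW_DECODE_W = 12.0   # CPU grinding through HEVC in software
HW_DECODE_W = 1.0    # fixed-function decode block doing the same job

for label, decode_w in [("software decode", SW_DECODE_W),
                        ("hardware decode", HW_DECODE_W)]:
    hours = BATTERY_WH / (PLATFORM_W + decode_w)
    print(f"{label}: {hours:.1f} hours of video playback")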

SuperDucky
May 13, 2007

by exmarx
My customers in the embedded segment are super stoked about the triple 4k outputs provided by the proc on Sky/Kaby without having to add a dedicated GPU FWIW.

computer parts
Nov 18, 2010

PLEASE CLAP

DrDork posted:

In a cynical world where Intel knows that a lot of battery test suites hinge heavily on things like "played a looped YouTube video until the laptop dies," something like this may give noticeable battery life boosts, and saying you're adding 25%+ to the "useful life" of a charge is absolutely something you can hang a mobile CPU generation on. On paper, anyhow.

If most people really are just watching Youtube videos then that's a useful thing to design for.

Potato Salad
Oct 23, 2014

nobody cares


SuperDucky posted:

My customers in the embedded segment are super stoked about the triple 4k outputs provided by the proc on Sky/Kaby without having to add a dedicated GPU FWIW.

But will any mobo manufacturer put three hdmi/dp outputs on their poo poo?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Potato Salad posted:

But will any mobo manufacturer put three hdmi/dp outputs on their poo poo?

Did they stop doing that? I have a really cheap lovely P67 motherboard from 2011, and it's got displayport, HDMI, and DVI. When I look at mobos now I don't see many that have displayport. What happened?

HMS Boromir
Jul 16, 2011

by Lowtax
Mine has Displayport, DVI and VGA; it wasn't very expensive but it's a Z170. A quick glance at other LGA 1151 chipsets seems to show that you need to go for something higher end if you want even one displayport on H170/B150.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Potato Salad posted:

But will any mobo manufacturer put three hdmi/dp outputs on their poo poo?

Imagine that: with embedded solutions you can order in things like more identical ports.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
If you're going to run three displays, you probably want to go balls out and have them all on DP. For that you currently still need a dedicated card. Having a combo of DP, HDMI and DVI sucks balls.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you

Combat Pretzel posted:

If you're going to run three displays, you probably want to go balls out and have them all on DP. For that you currently still need a dedicated card. Having a combo of DP, HDMI and DVI sucks balls.
DP will always and forever suck balls until they update the spec so that a monitor doesn't completely disappear from the system as an accessible display when you turn it off.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

real_scud posted:

DP will always and forever suck balls until they update the spec so that a monitor doesn't completely disappear from the system as an accessible display when you turn it off.

This is an issue with monitors implementing the spec incorrectly.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
Displayport owns and I can't believe how slow it has been to catch on, especially because the licensing fees are cheaper than HDMI's, as I understand it.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Twerk from Home posted:

Displayport owns and I can't believe how slow it has been to catch on, especially because the licensing fees are cheaper than HDMI's, as I understand it.

There are no fees, iirc

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
The cable length limitations on DP limit its deployment scenarios compared to HDMI. Furthermore, companies have already invested plenty into HDMI, and its compatibility with TVs, projectors, etc. is undisputed. HDMI maxes out at 30 meters for 1080p @ 60Hz, while DisplayPort maxes out at 5 meters for official certification, with anything beyond that getting spotty in terms of what bandwidth is actually supported.

SuperDucky
May 13, 2007

by exmarx

Potato Salad posted:

But will any mobo manufacturer put three hdmi/dp outputs on their poo poo?

We do 1 DP external and 2 DVI-D on the board itself. My Asrock z77 extreme4 had HDMI, VGA and DVI.

repiv
Aug 13, 2009

PerrineClostermann posted:

There are no fees, iirc

VESA themselves don't charge any fees, but MPEG LA has patents covering aspects of DisplayPort and charges $0.20/unit to avoid any trouble.

https://en.wikipedia.org/wiki/DisplayPort#Cost

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

real_scud posted:

DP will always and forever suck balls until they update the spec so that a monitor doesn't completely disappear from the system as an accessible display when you turn it off.

The spec says monitors should still show up as connected in power-off mode. Most monitors just don't implement the spec right.

Combat Pretzel posted:

If you're going to run three displays, you probably want to go balls out and have them all on DP. For that you currently still need a dedicated card. Having a combo of DP, HDMI and DVI sucks balls.

Yeah, especially since DisplayPort is the standard for "professional" monitors that get used with integrated graphics on your average work PC, and since DisplayPort conversion is one-way. You can convert DisplayPort to HDMI or DVI, which means that using DP lets you run any modern monitor at a minimum of 1440p@60 or 4K@30, but you cannot go the other way. This means that if your standard monitors are DisplayPort-only, you need active converters to drive them from HDMI or DVI outputs.

SuperDucky posted:

We do 1 DP external and 2 DVI-D on the board itself. My Asrock z77 extreme4 had HDMI, VGA and DVI.

DVI-D pretty much needs to die out like VGA did at this point, since again, you can adapt either HDMI or DP to DVI. I get hypothetically wanting to have HDMI on there since it's the standard connector for consumer equipment, but an HDMI<->DVI adapter is like two bucks.

Paul MaudDib fucked around with this message at 23:26 on Oct 3, 2016

SuperDucky
May 13, 2007

by exmarx

Paul MaudDib posted:


DVI-D pretty much needs to die out like VGA did at this point, since again, you can adapt either HDMI or DP to DVI. I get hypothetically wanting to have HDMI on there since it's the standard connector for consumer equipment, but an HDMI<->DVI adapter is like two bucks.

I don't disagree, but we had to do it that way due to compatibility restrictions. We have a chassis with a built-in monitor on the faceplate that has to have DVI. Also, I'd be scared of surface-mounting a DP connector on the board; it's too tall and there isn't enough support.

e: you wouldn't believe the hell we've gotten for not having VGA on this Skylake-S board. The very reason we never went to HDMI was that it didn't have a collar/lock on the connector. DP did, so we bit the bullet. PLUS, iirc, there's no way to pull an analog/VGA signal out of the HD/P530 graphics on Skylake-S without an offboard BMC or graphics controller, which there was absolutely no room for on the already slam-full PICMG 1.3 board.

SuperDucky fucked around with this message at 01:24 on Oct 4, 2016

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

SuperDucky posted:

I don't disagree, but we had to do it that way due to compatibility restrictions. We have a chassis with a built-in monitor on the faceplate that has to have DVI. Also, I'd be scared of surface-mounting a DP connector on the board; it's too tall and there isn't enough support.

e: you wouldn't believe the hell we've gotten for not having VGA on this Skylake-S board. The very reason we never went to HDMI was that it didn't have a collar/lock on the connector. DP did, so we bit the bullet. PLUS, iirc, there's no way to pull an analog/VGA signal out of the HD/P530 graphics on Skylake-S without an offboard BMC or graphics controller, which there was absolutely no room for on the already slam-full PICMG 1.3 board.

VGA and DVI should be actively eradicated, unless you're Dell or Lenovo making PCs that work in lovely corporations with 15-year-old projectors or something.

SCheeseman
Apr 23, 2003

I have a DP->VGA adaptor that cost 5 bucks and seems to work fine, so I don't see much point in keeping them around on the actual hardware.

HMS Boromir
Jul 16, 2011

by Lowtax
Isn't DVI still the only way to use those overclockable Korean monitors?

SCheeseman
Apr 23, 2003

HMS Boromir posted:

Isn't DVI still the only way to use those overclockable Korean monitors?
That's the case, unfortunately. The only DP->dual-link DVI adaptors are expensive as hell too, which kinda negates the savings on the cheapo monitor.

It's a shame if you have one of those monitors (I do!), but I guess next time I get a new video card I'll probably get something with G-Sync/FreeSync anyway. Here's hoping that DP->DLDVI converters go down in price; it'd work really nicely as a secondary.
