Panty Saluter
Jan 17, 2004

Making learning fun!

Don Lapre posted:

I understand you don't understand putting things into holes.

hey, it's all close together and hard to see in there :mad:


Anime Schoolgirl
Nov 28, 2002

movax posted:

My boss led development of the PCU that controls FIVR; he gets foamy at the mouth and mutters 'politics and loving Haifa' every time an article about FIVR being removed comes up.

There's huge competition between the US and Israeli design teams, with FIVR tagging along with Haswell as part of the US team's turn at the new uarch. I imagine a very popular anti-FIVR argument is that it eats die space and thermal envelope room without contributing anything, in the sense of it takes the place of transistors that could be cache or additional logic.

Honestly makes sense to me for mobile platforms though, and it's a great piece of IP to own as a company.
Well, they're kind of selling the same chips for both laptop and desktop, and it makes for really sick battery life on sleep. Laptops make up more of the sales volume, from what I've read, so there's no reason to exclude it if you don't want to make another chip line just for desktops.

Has the FIVR ever popped up on LGA2011 series, though? That's the one place I really hope it doesn't go in as it makes negative sense in that socket's market.

WhyteRyce
Dec 30, 2001

I've installed stock heatsinks more times than I can count without breaking anything, even ones that had a broken peg. I don't understand why people are so mystified by them and complain about the difficulty, but then praise aftermarket stuff that requires backplate installation, spacers, gaskets, washers, pegs, etc. Stock heatsinks are cheap, but drat if they aren't super easy to handle, and they require zero tools.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Don Lapre posted:

The Intel stock cooler is perfectly fine for stock speeds and is not difficult at all to use. Most people simply have no idea how the drat push pins work and do poo poo like spin them all the way around. The push pins go one way when locked, and turn 90 degrees to release.

To release a stock Intel cooler, turn the pins 90 degrees and pull up; the cooler will come right off. To reinstall, just turn them 90 degrees the opposite way to reset them. It's so loving easy.

http://www.intel.com/content/www/us/en/support/processors/000005852.html

I've seen reviews where a 4790K will throttle with it, but that aside, I'm referring not to the fact that it's difficult to install initially (it isn't), but that it can be difficult to re-fit after years of use, because the plastic surrounding the pins that splays out to fix the cooler has a tendency to stay splayed out. So when you re-fit it, it can feel like you're up against the normal resistance, but instead, you snap one half of the plastic around the pin. I'm not talking about the releasing mechanism, which is easy in itself, although if the plastic has splayed apart and won't spring back, you often have to pinch the pins together from the underside of the board.

You can tell it's been designed to be fitted once, quickly, and left alone. Indeed it is fine for that, and that will be the majority of cases.

HalloKitty fucked around with this message at 07:54 on Jan 4, 2016

Xir
Jul 31, 2007

I smell fan fiction...
I'm not sure if this is the right thread but I want to ask a question to sanity check myself. I have an i7-4790K running at stock clocks. When I do an h.264 encode the temp in some cores hits 90C. Should I be concerned about this temp or is that still in the acceptable range?

Ihmemies
Oct 6, 2012

I'd sweat profusely if temps were even near 70C. I'd power down my system in panic if temps hit 90C, and check for broken fans.

DeaconBlues
Nov 9, 2011
All this talk of stock coolers has got me wondering if they are all actually the same model? For example, would the cooler found on a 2014 i7 be the same model as the one on a 2014 i3? Are the more recent models better performing, or has the design remained unchanged over the last five years?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Xir posted:

I'm not sure if this is the right thread but I want to ask a question to sanity check myself. I have an i7-4790K running at stock clocks. When I do an h.264 encode the temp in some cores hits 90C. Should I be concerned about this temp or is that still in the acceptable range?

It's perfectly OK. Your CPU can get up to 100C, and it will just start throttling at that point. Encoders use some of the same stuff prime95's holy gently caress mode does, so you see similar temps. It won't harm anything.

DeaconBlues posted:

All this talk of stock coolers has got me wondering if they are all actually the same model? For example, would the cooler found on a 2014 i7 be the same model as the one on a 2014 i3? Are the more recent models better performing, or has the design remained unchanged over the last five years?

They are similar but not the same. i7 coolers have copper slugs.

PC LOAD LETTER
May 23, 2005
WTF?!

HalloKitty posted:

So when you re-fit it, it can feel like you're up against the normal resistance, but instead, you snap one half of the plastic around the pin.
Yup. If you snap or bend one of the plastic fingers, the stock Intel HSF can be a real bitch to mount. They're somewhat delicate and bend easily when you're wiggling the HSF around to try and get the pegs into the holes. The side-to-side motion you use to "find" the holes with the pegs, which is what most people do since space is cramped and lighting is poor inside most cases, is what usually makes them bend or snap.

Yeah the old Socket A/370 style mounts were difficult to use but more modern Socket 939 and AM3 clips aren't bad at all and aren't easily damaged. The lever versions in particular are some of the easiest around to use.

Sky Shadowing
Feb 13, 2012

At least we're not the Thalmor (yet)
Update on my situation:

I am a colossal loving moron. :downs:

The cord had gone over the fan, I believe, and was preventing it from spinning. So no wonder it was loving burning up.

I've kept my eye on it since noticing that at noon, and the temp has stayed around 40 or so (still underclocked). I'm going to crank it back up to full speed in a few hours and see what happens. I'm still going to put on the new cooler and such tomorrow, because I don't entirely trust my stock cooler anymore and because I love tweaking my PC (though thank god my Antec 900 has a CPU cutout, so I won't have to tear my PC apart).

It has basically only been on a few hours total since the issue began, so hopefully damage was basically non-existent.

Thanks for your help, goons.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Modern CPUs are actually very good at preventing you from cooking them, functional fan or not--they'll just downclock themselves like hell when they get too hot. About the only way you can really cook them to death these days is if you have no heatsink whatsoever, and/or you bump the voltage up a good bit for the hell of it.
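What DrDork describes is essentially a closed control loop: heat in scales with clock speed, heat out with the temperature delta to ambient, and the governor sheds clocks whenever the die crosses Tjmax. A toy sketch of that loop, with entirely made-up constants that don't model any real CPU:

```python
# Toy closed-loop model of CPU thermal throttling. All constants are
# illustrative, not taken from any real part.
TJMAX = 100.0      # throttle trigger, deg C
AMBIENT = 25.0

def simulate(cooling_coeff, steps=2000, dt=0.1):
    """Run the loop; return (peak temperature, average clock ratio)."""
    temp, clock = AMBIENT, 1.0            # clock is a fraction of max
    peak, clock_sum = temp, 0.0
    for _ in range(steps):
        heat_in = 60.0 * clock                       # watts ~ clock speed
        heat_out = cooling_coeff * (temp - AMBIENT)  # heatsink removes heat
        temp += (heat_in - heat_out) * dt * 0.05
        if temp >= TJMAX:                  # governor: shed clocks at Tjmax
            clock = max(0.2, clock - 0.05)
        elif clock < 1.0 and temp < TJMAX - 5.0:
            clock = min(1.0, clock + 0.01)           # regain headroom
        peak = max(peak, temp)
        clock_sum += clock
    return peak, clock_sum / steps

print(simulate(cooling_coeff=1.5))   # decent heatsink: full clocks held
print(simulate(cooling_coeff=0.4))   # blocked fan: throttled, but alive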

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

DrDork posted:

Modern CPUs are actually very good at preventing you from cooking them, functional fan or not--they'll just downclock themselves like hell when they get too hot. About the only way you can really cook them to death these days is if you have no heatsink whatsoever, and/or you bump the voltage up a good bit for the hell of it.

I recently did that "wires blocking the CPU cooler fan from spinning at all" thing, and it made it through several benchmarks; I couldn't figure out why it was topping out at ~40fps in games, though. Poor lil guy was throttling itself to stay alive.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
I had my Q6600 machine shut down several times within minutes of starting rendering video recently. Turned out that the heatsink was extremely dusty, to the point that I don't think I've ever seen one so bad before. Not sure what was up with that as I do clean it out once in a while, but the temps got pretty ridiculous before I blew it out again.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

I suspect that I recently had a Xeon E5 cook itself. The server arrived with a loose heatsink; after some tests it was identified (it throttled to ~1/2 of expected performance under load). After the heatsink was reattached, the CPU would only perform at 2/3 of peak. The problem followed the CPU to different heatsinks and sockets (correct thermal paste, etc.). I expected that the throttling would've prevented any permanent damage, so it was surprising to see ongoing performance issues.

EdEddnEddy
Apr 5, 2012



I think the throttling can save it when it is able to at least vent some heat. A heatsink that is attached properly but has no fan should still work well enough to keep from damaging anything, but a heatsink that isn't attached is much more likely to kill it, as there's nowhere for that instant build-up of heat to go quickly enough.

I did see a kid at a local LAN with a new 4770K who couldn't figure out why his PC wouldn't play CS:GO at 200 FPS like he was told it would. People were checking all over, drivers, the Windows install, etc., but nobody decided to check the temps. It looked like a throttling issue to me, since it would play at >100 FPS for a few seconds, then drop down to the 30-50 FPS range for the remainder of a match.

Take a look at the temps: yep, 99C. The heatsink was hanging by a single one of the push-pin arms, and they were turned to the released position. The kid said he followed the arrows.

Set him straight, and the comp went on to play in the 40-60C range at 200+ FPS like he had been told.


I agree that some people just shouldn't build or work on their own PCs, even in this easy-as-pie day and age. Also, the stock coolers used to be better, IMO, than the ones they have now. The ones that came with the early Core 2 Quads or 1st-gen i7s were massive stockers, and with the copper core the things were practically an upgrade for a lower-end CPU that came with only an aluminum core and shorter fins. I was able to overclock a few lower-end C2Qs and i7s with the "upgraded" coolers, which allowed a modest OC without creating as much heat as the cooler they came with.

Of course any actual OC needs a good aftermarket cooler, but for the most part the stocker is just fine at stock clocks, as long as you install it, make sure the pins go into the holes, push the four pins, and hear the drat click.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Broadwell-E is still slated for Q2, right? Any more details yet on whether it's the start or end of Q2? Hoping the speculation of 600ish bucks for an eight-core one turns out to be true.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD

mobby_6kl posted:

I had my Q6600 machine shut down several times within minutes of starting rendering video recently. Turned out that the heatsink was extremely dusty, to the point that I don't think I've ever seen one so bad before. Not sure what was up with that as I do clean it out once in a while, but the temps got pretty ridiculous before I blew it out again.

this is a big one, especially in homes/offices that go through a renovation while still in use. drywall powder is loving horrific.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you
I recently decided to use some of my bonus money to build a new machine, figured I hadn't upgraded in about 5 or 6 years and may as well do it now.

This was my build:
1tb Samsung Evo 850
ASUS Maximus VIII Hero
i5 6600K
G.Skill Ripjaws V 32GB (4x8gb) DDR4 3000 1.35v
Hyper 212 EVO
Nvidia GTX 980ti

Got pretty much everything installed and running quickly since this isn't the first system I've built.

However, I'm not sure if it's my mobo or the new Z170 chipset, but I've never had a mobo have a problem with the rated speed of RAM before. Initially I ordered some older Ripjaws that I guess were originally meant for X99 boards, since their speed was only 2400MHz and they only took 1.2V.

I had some issues getting XMP working properly and the board booting, so I ordered sticks that Asus themselves had verified as compatible with the board. The new 3000MHz sticks arrived and I put them in; sure enough, they had the same problem. If I went with the stock XMP speed, the board wouldn't POST and would fall back on the failsafe speed of 2133MHz.

It was only after some googling that I came to see that 2133MHz is actually the stock speed for these sticks, and any other quoted speed is basically overclocking, albeit OC straight from the manufacturer. After also reading that giving it a bit more voltage, or turning the speed down slightly, could help, I did manage to get the board stable.

So right now I'm running the memory at 2933MHz with 1.36V, and things are stable.

My question to you guys is this: is this a normal thing with every Z170 board, or do I have either faulty RAM or a faulty mobo that I should return?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Stable memory speed depends on the CPU's memory controller, and it's a bit like the silicon lottery.

dud root
Mar 30, 2008

real_scud posted:

G.Skill Ripjaws V 32GB (4x8gb) DDR4 3000 1.35v

My question to you guys is this: is this a normal thing with every Z170 board, or do I have either faulty RAM or a faulty mobo that I should return?

I've got the same issue with the same RAM, but specced at 3200. It will only run at 3000 speeds, and anything higher will cause the mobo to boot at 2133.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you

dud root posted:

I've got the same issue with the same RAM, but specced at 3200. It will only run at 3000 speeds, and anything higher will cause the mobo to boot at 2133.
Have you tried upping the voltage slightly? That was one tip I heard that has kinda worked for me, since apparently some mobos only give it 1.345V and not 1.35V.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Any reason someone would want faster than DDR4-2133 on pre-Skylake anyway? I've seen the Anandtech benchmark, and the differences between all variants from 2133 to 3200 (or 3400, I don't remember) disappeared into the error noise. Only memory-specific benchmarks showed slight improvement. Probably better to spring for lower CAS latency instead?
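The CAS intuition above can be sanity-checked with the standard first-word-latency arithmetic: latency in nanoseconds is CL cycles divided by the I/O clock, which runs at half the transfer rate on DDR. A quick sketch (the rate/CL pairings below are illustrative, not specific retail kits):

```python
def first_word_latency_ns(transfer_rate_mts, cas_latency):
    """DDR first-word latency: CL cycles at the I/O clock (half the MT/s)."""
    io_clock_mhz = transfer_rate_mts / 2      # DDR: 2 transfers per clock
    return cas_latency / io_clock_mhz * 1000  # cycles -> nanoseconds

# Faster kits often ship with higher CL, so absolute latency barely moves.
for rate, cl in [(2133, 15), (2666, 16), (3000, 15), (3200, 16)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")
```

DDR4-2133 CL15 and DDR4-3200 CL16 land only a few nanoseconds apart, which is why the non-memory-bound benchmarks barely move.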

Anime Schoolgirl
Nov 28, 2002

if you want ddr4-3200 to actually be usable you should have gone with the x99 platform, as the memory controllers on those aren't lovely

on skylake in general you don't want to go over 2400 because 1) there's no real benefit to cpu function except in border cases 2) the memory controllers can't actually handle speeds that high (your sample is actually one of the better ones, most skylake chips poo poo their pants at 2700-2800) 3) you're paying more for better integrated graphics performance, which you're not using

unless you want to get sick framerate on fallout 4, then by all means

Anime Schoolgirl fucked around with this message at 15:11 on Jan 5, 2016

HMS Boromir
Jul 16, 2011

by Lowtax
These guys found significant differences in game performance between DDR4-2133 and DDR4-2666 with an i3-6100, but they seem to be the only ones, and I don't know if they've tested it with any other CPU, so v0v

feedmegin
Jul 30, 2008

Lord Windy posted:

How do you guys use so much RAM? I have 16 and through normal use have never used more than 8

Any modern OS will use unused-by-applications RAM for disk cache, so having extra should still give you some benefit.
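On Linux, the disk-cache point is visible in /proc/meminfo: "free" memory looks tiny while most of the gap is reclaimable page cache. A minimal sketch that parses a captured snapshot rather than the live file; the snapshot values below are made up for illustration:

```python
def parse_meminfo(text):
    """Parse /proc/meminfo-style 'Key:  value kB' lines into a dict of kB."""
    info = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        fields = rest.split()
        if fields:
            info[key.strip()] = int(fields[0])
    return info

# A made-up snapshot; on a real Linux box you'd read open("/proc/meminfo").
SNAPSHOT = """\
MemTotal:       16318256 kB
MemFree:         1202332 kB
MemAvailable:   11654208 kB
Cached:         10172644 kB"""

mem = parse_meminfo(SNAPSHOT)
# "Free" looks small, but the cache is given back to applications on demand.
print(f"free: {mem['MemFree'] / 2**20:.1f} GiB, "
      f"cache: {mem['Cached'] / 2**20:.1f} GiB of {mem['MemTotal'] / 2**20:.1f} GiB")
```

This is why task managers that only show "free" make 16GB look exhausted: MemAvailable, not MemFree, is the number that reflects what applications can actually claim.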

japtor
Oct 28, 2005
New NUCs announced, there's the usual ones but also going to be a quad core Iris Pro model with TB3:
http://arstechnica.com/gadgets/2016/01/intels-next-nuc-will-be-a-quad-core-mini-pc-with-iris-pro-and-thunderbolt-3/

Somewhat related if you haven't seen it yet, Razer announced a TB3 GPU enclosure/dock:
http://www.razerzone.com/gaming-systems/razer-blade-stealth#gpu-support

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Ah poo poo, found out that Intel plans to release Broadwell-E at Computex. That's June, plus whatever time it needs to trickle down to the shops. :(

movax
Aug 30, 2008

japtor posted:

New NUCs announced, there's the usual ones but also going to be a quad core Iris Pro model with TB3:
http://arstechnica.com/gadgets/2016/01/intels-next-nuc-will-be-a-quad-core-mini-pc-with-iris-pro-and-thunderbolt-3/

Somewhat related if you haven't seen it yet, Razer announced a TB3 GPU enclosure/dock:
http://www.razerzone.com/gaming-systems/razer-blade-stealth#gpu-support

Hm, I bet the i5 version of that would make a decent HTPC + Steam remote play box. Wonder if it supports 4K60 output.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




japtor posted:

New NUCs announced, there's the usual ones but also going to be a quad core Iris Pro model with TB3:
http://arstechnica.com/gadgets/2016/01/intels-next-nuc-will-be-a-quad-core-mini-pc-with-iris-pro-and-thunderbolt-3/

Somewhat related if you haven't seen it yet, Razer announced a TB3 GPU enclosure/dock:
http://www.razerzone.com/gaming-systems/razer-blade-stealth#gpu-support

Why are all of these video card enclosures always so big? This one is at least smaller than the Alienware one, but these things are the size of full mini-ITX computers.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

With graphics cards the size they are, it's hard to go smaller. And yes, I realize there are ITX options for some cards, but there aren't that many.

You can probably also guess what dictates the size of itx enclosures.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I do notice that Razer is quick to put a '40 Gigabits is a lot, look at this tall bar' graph up there to sell the concept to people, but 40 gigabits per second is 5GB/second (and that's saturating the whole available bus). PCIe 3.0 x16 is ~16GB/sec, and 2.0 x16 is half that. For all intents and purposes, Razer's TB3 box will give you PCIe 3.0 x5-equivalent performance - at the most - and that doesn't even really factor in the (probably infinitesimal) lag you'll most likely get by piping in a video signal over a highly-glorified TB3->PCIe bridge chip.

Given that the laptop they're trying to pair it to only has QHD and UHD options...I think they're pulling a snow job.

BIG HEADLINE fucked around with this message at 09:48 on Jan 8, 2016
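The arithmetic in the post above can be checked in a few lines, using PCIe 3.0's 8 GT/s per lane with 128b/130b encoding (raw link rates only, ignoring protocol overhead on either side):

```python
def pcie3_gbytes_per_s(lanes):
    """PCIe 3.0 raw rate: 8 GT/s per lane, 128b/130b encoding, 8 bits/byte."""
    return lanes * 8e9 * (128 / 130) / 8 / 1e9

tb3_gbytes = 40 / 8   # 40 Gbit/s Thunderbolt 3 link -> 5 GB/s
print(f"TB3:        {tb3_gbytes:.2f} GB/s")
print(f"PCIe3 x16:  {pcie3_gbytes_per_s(16):.2f} GB/s")
print(f"TB3 is worth ~{tb3_gbytes / pcie3_gbytes_per_s(1):.1f} PCIe 3.0 lanes")
```

The full 40 Gbit/s pipe works out to roughly five PCIe 3.0 lanes' worth of bandwidth, matching the "x5-equivalent at most" figure.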

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

BIG HEADLINE posted:

I do notice that Razer is quick to put a '40 Gigabits is a lot, look at this tall bar' graph up there to sell the concept to people, but 40 gigabits per second is 5GB/second (and that's saturating the whole available bus). PCIe 3.0 x16 is ~16GB/sec, and 2.0 x16 is half that. For all intents and purposes, Razer's TB3 box will give you PCIe 3.0 x5-equivalent performance - at the most - and that doesn't even really factor in the (probably infinitesimal) lag you'll most likely get by piping in a video signal over a highly-glorified TB3->PCIe bridge chip.

Given that the laptop they're trying to pair it to only has QHD and UHD options...I think they're pulling a snow job.

it turns out that graphics is barely affected by halving or quartering the pcie link bandwidth: https://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/

this is not really an issue even at 4k. nobody is waiting for slow-rear end pcie transfers when they are rendering frames.

japtor
Oct 28, 2005

VulgarandStupid posted:

Why are all of these video card enclosures always so big? This one is at least smaller than the Alienware one, but these things are the size of full mini-ITX computers.
Linus Tech Tips has a video on it where they pull out the core; it doesn't seem too much bigger than it has to be. It's enough for a big rear end card and a 500W PSU next to it, and it doesn't seem like it's wasting a lot of space.

BIG HEADLINE posted:

I do notice that Razer is quick to put a '40 Gigabits is a lot, look at this tall bar' graph up there to sell the concept to people, but 40 gigabits per second is 5GB/second (and that's saturating the whole available bus). PCIe 3.0 x16 is ~16GB/sec, and 2.0 x16 is half that. For all intents and purposes, Razer's TB3 box will give you PCIe 3.0 x5-equivalent performance - at the most - and that doesn't even really factor in the (probably infinitesimal) lag you'll most likely get by piping in a video signal over a highly-glorified TB3->PCIe bridge chip.

Given that the laptop they're trying to pair it to only has QHD and UHD options...I think they're pulling a snow job.
TB3 uses PCIe 3.0 x4 plus whatever overhead, so even less! Here's some more PCIe scaling benches for reference though:
http://www.techpowerup.com/mobile/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/21.html

I am curious about the bandwidth when outputting to the internal screen though and how that affects performance. Off the top of my head at worst that'd eat a bit over a third of the bandwidth?
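The internal-screen guess above can be ballparked the same way: an uncompressed RGB stream costs width x height x bits-per-pixel x refresh rate. This ignores blanking intervals and any link compression, so the real cost runs somewhat higher than these figures:

```python
def stream_gbit_per_s(width, height, hz, bits_per_pixel=24):
    """Uncompressed RGB video bandwidth, ignoring blanking intervals."""
    return width * height * bits_per_pixel * hz / 1e9

TB3_GBIT = 40   # nominal Thunderbolt 3 link rate
for name, w, h in [("QHD", 2560, 1440), ("UHD", 3840, 2160)]:
    g = stream_gbit_per_s(w, h, 60)
    print(f"{name}@60: {g:.1f} Gbit/s = {g / TB3_GBIT:.0%} of the TB3 link")
```

A raw UHD60 stream comes to about 30% of the nominal 40 Gbit/s link before blanking overhead, so "a bit over a third" is in the right neighborhood.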

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Malcolm XML posted:

it turns out that graphics is barely affected by halving or quartering the pcie link bandwidth: https://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/

this is not really an issue even at 4k. nobody is waiting for slow-rear end pcie transfers when they are rendering frames.

This is because, as I understand it, you load the entire dataset of what you might want to render from into the memory of your GPU. A smaller pipe from PC to eGPU should only mean slightly longer loading times.

Rastor
Jun 2, 2001

ASUS is claiming that using Thunderbolt adds noticeable latency, so they custom-built a PCIe-over-USB-C solution that they say achieves 99.99% of the performance of PCIe x16.

Hopefully somebody will benchmark both solutions.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


I'm shocked it's taken so long for someone to market a commercial external GPU. Most gaming laptops are disgustingly gaudy, and it'd be nice to be able to have a ThinkPad or MacBook play real games.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




Tab8715 posted:

I'm shocked it's taken so long for someone to market a commercial external GPU. Most gaming laptops are disgustingly gaudy, and it'd be nice to be able to have a ThinkPad or MacBook play real games.

The biggest issue was the lack of an interface, but USB 3.1 seems to be good enough for it. Alienware released an external GPU enclosure fairly recently but used a proprietary port, which meant most people couldn't use it. Before that, there was some ExpressCard stuff going on, but it was limited to something like PCI-E 2.0 x2.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Thunderbolt has been around for 2-4 years, I don't see why we couldn't use that interface.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Tab8715 posted:

Thunderbolt has been around for 2-4 years, I don't see why we couldn't use that interface.

Because nothing ships with Thunderbolt besides a few random Sony laptops and Apple computers. The Sony laptops did use it for external GPU stuff, but IIRC they weren't that good.


sadus
Apr 5, 2004

Tab8715 posted:

I'm shocked it's taken so long for someone to market a commercial external GPU. Most gaming laptops are disgustingly gaudy, and it'd be nice to be able to have a ThinkPad or MacBook play real games.

For what it's worth, my $99 Kangaroo with an Atom CPU can do the Steam In-Home Streaming thing from my real PC pretty much OK; it only looks a little messed up once in a while, briefly. I bet any laptop or even a NUC could do it flawlessly.
