Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Space Gopher posted:

When you look at overall system performance, though, the downsides (increased noise, heat, power consumption, and cooler cost) often outweigh the "benefit" of a CPU that's just going to be bottlenecking harder rather than running faster.

Considering that I managed, with a $24 cooler, to increase my i7-920's speed from 2.66GHz to 3.15GHz *and* decrease its VCore by around 15% from stock... I think I came out ahead on performance, heat, and power consumption for a fairly minor outlay in cooling cost.

Anecdotal, yes, but it seems to me that minor overclocking offers a very favorable reward for the risk and outlay involved.


Eletriarnation

Wedesdo posted:

Well, it's at 1.38V and 4.8GHz right now. I shouldn't get too greedy, right?

That's really good from what I've read, so if you kill it and get a replacement, I wouldn't expect the new chip to clock that high.

Eletriarnation

Sidesaddle Cavalry posted:

What is the consensus about Intel's NUC units as of late? Are they a decent solution for a user who might need to set up a DIY-built desktop in various places while being able to pack it up in a backpack every once in a while?

I'm asking because I'm reading rumblings here and there about NUCs possibly making a huge jump in performance once they get Skylakes with Irises in them more available, with a big whopper with the top level Iris 580 Pro possibly making it into one :awesome:

also how do they compare to steam machines

I have a Broadwell i3 NUC (NUC5i3RYH) with a 128GB M.2 SSD and 2x8GB SODIMMs, and it does a great job with Windows 10, Ubuntu, or Fedora running office applications, video conferencing, and lots of telnet/SSH sessions. It's responsive, and I haven't had any problems doing the same things on it that I do on my main work system, a 13" Haswell MacBook Pro. As long as you have realistic expectations for a mobile dual-core and integrated graphics I think it's a good system, but if you have any interest in gaming or heavy computing loads you'd probably prefer a mini-ITX system with a real GPU and/or a desktop processor. There's not really a cost benefit to getting a NUC versus a desktop (especially if you mean price/performance and not just raw price); it's really about size, noise, and power consumption.

Edit: Having it has made me consider getting the Celeron model to replace my Raspberry Pi as a torrent box so I can get USB3.0 and gigabit Ethernet, but I haven't yet decided that a 10x increase in download speed from the torrent box is really worth $150.

Eletriarnation fucked around with this message at 16:29 on Dec 9, 2015

Eletriarnation

japtor posted:

Hows the noise with NUCs nowadays? From the earlier posts I'm guessing quiet enough. Just asking cause I remember the early models could be annoying in that regard according to reviews.

I have a Broadwell i3 and it is silent when idling - or at least inaudible over the HVAC in the server room on the next floor up. It's also silent with 4 threads of Prime95 going, and hovers right under 70C. I don't know what it's like if you're taxing the CPU and GPU at once, but I imagine it can't be that bad.

Eletriarnation

Psmith posted:

So I'm finally considering upgrading my CPU/MOBO and I have a few questions.

I have an i5 2500k that I bought about 4 years ago. I have recently upgraded my GPU to a 970 and I've been keeping my eye on Oculus (and VR tech in general) so I kind of want to get my CPU up to date. The main selling point for me here is that because the 2500k is still a good chip that allows me to run mostly everything at higher quality, I will use the current board/chip to create a secondary gaming PC for the living room.

So with that being said, I'm thinking about upgrading to the i7 4790K as it is $300 at Microcenter. I would get a new board as well. Getting the new board is because when I built this machine originally I got a small form factor board to fit my smaller case at the time. This has been something of a hassle as I've continued to upgrade so I feel like it's time for a full size motherboard.

So I have a few questions:

1) Is the upgrade even worth it? I've done a little research on google and the responses seem a little mixed. It seems that I could OC the 2500k fairly easily but I'm currently using stock cooling. So I'd have to try and fit an aftermarket cooler on my small form factor mobo. The recommended CPU for Oculus is "i5-4590 equivalent or greater" so that's my target. Oculus isn't my only concern but I believe it would be the most intensive one.

2) This is always my concern with buying a new expensive computer component but is there some wonderful brand new CPU tech on the horizon that is worth waiting for?

Thanks!

I have an i5-2500k with a cheap air cooler (Hyper 212) and can run it at 4.4GHz on stock voltage, changing no settings but the multiplier. It might be able to go higher; I've never tried. I don't know if you could do this effectively on the stock cooler, but you could probably find an upgrade that would fit your case, especially considering your power consumption won't increase that much if you don't change the voltage.

If you do have a 2500k with a good overclock, it's not going to be holding you back by any appreciable amount. See these benchmarks with a Skylake vs. a much older Nehalem i7-965. That system has the additional handicap that Nehalem only supports PCIe 2.0, yet with a Titan X it still mostly manages to avoid being bottlenecked much more than an overclocked Skylake.

There isn't much on the horizon that's nearby and confirmed as far as CPUs go.

Eletriarnation fucked around with this message at 07:38 on Jan 13, 2016

Eletriarnation

Psmith posted:

I appreciate all of the replies - they confirmed a lot of what I had kind of suspected: that the 2500k is basically amazing and I could continue to stretch it out even now 4 years later.

Yeah, this is definitely true. I found an article the other day comparing an i7-965@3.67GHz, which is a few years older than the 2500K, to a 6700K at 4.7GHz, using a Titan X to maximize any CPU bottlenecking. The absolute biggest difference I saw was around 20%, and most games were under 10%. Considering that I'm only using a 7850 for the moment, I'm at least going to wait for Pascal/Polaris to see if there's any reason to consider a platform replacement.

Boiled Water posted:

How well is L4 cache transferred to L3?

I think it's typical for each level of cache to take a handful of CPU cycles to load information from the next level down, but that's still a lot better than the 100+ cycles that going all the way to main memory takes. Not sure if there are any articles getting deep into the Broadwell architecture that could clear it up.

Ed.: The graph here seems to indicate that it's a lot more than one cycle, but still only around half the time that accessing main memory takes.
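As a rough sketch of why an L4 at roughly half of DRAM latency still helps, here's the standard average-memory-access-time formula. The latency numbers are illustrative assumptions, not measured Broadwell figures:

```python
def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    """Average memory access time: hit time + miss rate * miss penalty."""
    return hit_time_ns + miss_rate * miss_penalty_ns

# Assumed latencies: an L3 miss costs ~45 ns if the L4 eDRAM catches it,
# versus ~90 ns for a trip to main memory, and suppose 20% of L4 lookups
# miss through to DRAM anyway.
with_l4 = amat(45.0, 0.2, 90.0)   # L3 misses mostly served by L4
without_l4 = 90.0                 # every L3 miss goes straight to DRAM

print(f"with L4: {with_l4:.0f} ns, without: {without_l4:.0f} ns")
```

Even with a pessimistic L4 miss rate, the blended cost of an L3 miss drops by about a third in this toy model.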

Eletriarnation fucked around with this message at 16:40 on Jan 15, 2016

Eletriarnation

AVeryLargeRadish posted:

six cores is awfully tempting if you do heavily multithreaded workloads.

This was my thought, unless you mean something different by "I'm interested in what i7 has to offer". An LGA2011 chip would give you full extra cores on top of the fractional gains from the HT logical cores, and would still be overclockable, although not quite as high as the four-core chips on average.

On your first question though, I don't know of any way in which desktop Haswell chips are superior to Skylake except price and ability to handle certain large exponents in Prime95.

Eletriarnation

Kazinsal posted:

Windows client doesn't do PAE on 32-bit anymore. Hasn't since XP SP2. 32-bit Server does, and PAE is a literal requirement to enter 64-bit long mode, so that one is a given.

There's a highly unofficial kernel patch for Windows 7 to enable PAE but ymmv on using unofficial kernel patches in a production environment.

It's really surprising that PAE isn't used on 32-bit, because Windows 8.1 and 10 32-bit will throw an error if you try to install them on processors without it. I tried with a Pentium M 725 and had to upgrade to a 760 or some such to get PAE support so I could upgrade from XP.

...at least, I think it was PAE. Ark says that both models only have 32-bit PAE support but I feel like there was some obscure detail like the 725 not even advertising support for the extensions where the 760 did or something like that. A bit of Googling indicates that it could also have been NX or SSE2 though.

Eletriarnation fucked around with this message at 16:03 on Jan 19, 2016

Eletriarnation

Ervin K posted:

Is there not going to be a desktop 6 core like there was with haswell? I've searched around on google and am getting some mixed messages.

according to this list: https://en.wikipedia.org/wiki/Haswell_(microarchitecture)#Desktop_processors
the 6 and 8 core ones were just a few months behind the rest for haswell.

edit: actually it seems it was at least a year behind some of the earliest haswell releases so I guess that explains that.

Either way it looks like 6700k is more than twice as good as my 3570k for certain workstation applications so I'll probably go with that.

The 6-core and 8-core chips are a different socket/platform and basically just rebranded Xeons, yeah. They come out a while later, when the Xeons for a given generation do. I believe this has been the case since Sandy Bridge; Core 2 didn't have a separate line of prosumer chips on the Xeon platform, and the first-generation prosumer i7s (Bloomfield, i7-9xx) actually preceded the normal desktop/laptop lineup for that generation.

Eletriarnation

NihilismNow posted:

Is there any chance that Xeon E5-v4 will be compatible with LGA 2011-v3 or will we be forced onto LGA-2011v4 motherboards? On the desktop broadwell and haswell shared a socket. I do have a C612 motherboard so a potential upgrade to broadwell-e would be nice (although i am fine with Haswell-E).

Ivy Bridge-E and Sandy Bridge-E shared a socket, so I think it's quite possible.

Don Lapre posted:

If you arn't overclocking get a 6700k, if you dont mind overclocking get a 5820k and oc it to 4-4.5ghz

Why get a K chip at all if you aren't overclocking? I know the regular 6700 is a little bit slower, but is it worth the extra cost if that's all you get?

Eletriarnation fucked around with this message at 20:01 on Jan 29, 2016

Eletriarnation

Xir posted:

I got a 4790K and am not overclocking because I wanted 4GHz and the OPTION of overclocking. I bought a Z board and a good cooler and I run it stock at the moment. The K processors have nice base clocks and that might be enough reason to buy one and pair it with an H board.

I interpreted the bit I was quoting as "if you are open to the idea of overclocking get a 5820K, if not get a 6700K", which doesn't really jibe with what you just said, though. It's unquestionable that you have more options if you start with a K processor, but if you truly never plan to overclock, then by getting a 6700K you're paying a substantial premium for a few hundred MHz (not at MSRP, but at current retail prices, taking into account that you have to buy a cooler too).

If you hear "If you aren't overclocking get a 6700k" as "hey it's a good processor and you might want to overclock it but you don't have to right now" then that applies just as well to the 5820K and brings me right back to questioning the rationale of the statement.

If you buy an H board and put a 6700K in it, then you've paid more than you would have getting a 6600K and a Z board (and not a lot less than if you had just stayed with the 6700K and gotten a Z board) and you can't overclock at all on a processor that has a premium for that exact purpose. If you ever decided you wanted to overclock you'd then have to go buy a Z board and spend a lot more in total than if you had bought it in the first place. Sounds like a mistake but maybe that's just me.

slidebite posted:

I have a question about dual channel memory.

A friend of mine has a laptop which appears to be an Intel HM87 chipset. It has 4 DDR3 slots. 2 empty ones are easy to get at but the other 2 are buried deep and require a significant amount of taking apart. He wants to put in another 8GB and it appears the"paired" slots are next to each other - IE the 2 that are easy to get at are a pair, the other 2 which are buried are also a pair. Problem is this: The OEM put the stick it already in the slot under the keyboard, so to properly give it it's dual channel partner it's going to be a pain in the rear end.

Any real downsides to running the extra stick(s) in the easily accessible slot(s)? My gut is telling me no because it's already only running with 1 stick in the dual channel slot so I can't imagine it would make a practical difference but I don't know for sure.

Depends on what you're doing with the laptop, but you're right that it won't get slower than it is now. It probably won't make a noticeable difference for basic multimedia/web browsing/office use stuff, and not much for gaming.

Eletriarnation fucked around with this message at 18:05 on Jan 31, 2016

Eletriarnation

JawnV6 posted:

Mismatched sticks run at the slower of the two, so it is possible to degrade overall system performance by adding unpaired RAM. I don't see why the physical accessibility matters, you can get the timing information about that channel's current configuration through software.

You're right - I was assuming that he had matched sticks, since he was planning to dual-channel it before he saw the physical position of the slots. If you want to collect SPD info without being able to look at the stick, try CPU-Z.

It's also the case that even if he had an SPD mismatch and putting in the second stick loosened the timings, he probably wouldn't notice in real-world applications.
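To illustrate the "slower of the two" behavior described above, here's a toy sketch. The stick specs are hypothetical, and real SPD negotiation walks full JEDEC timing tables rather than just a clock and a CAS latency:

```python
def channel_config(sticks):
    """Toy model of mixed sticks sharing a channel: the controller runs
    both at the lowest common clock and the loosest (highest) CAS latency
    either stick requires."""
    return {
        "mhz": min(s["mhz"] for s in sticks),
        "cl": max(s["cl"] for s in sticks),
    }

oem_stick = {"mhz": 1600, "cl": 11}   # hypothetical factory SODIMM
new_stick = {"mhz": 1333, "cl": 9}    # hypothetical cheaper addition

# The channel drops to the slower clock and keeps the looser CAS:
print(channel_config([oem_stick, new_stick]))  # {'mhz': 1333, 'cl': 11}
```

In practice the faster stick's SPD usually carries a tighter profile for the lower clock too, which is one reason the mismatch rarely matters in real-world use.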

Eletriarnation

fishmech posted:

They work. Of course they won't support DirectX 12 stuff, since they're 8 year old cards now, but they'll do all the same stuff they did in 7 or 8.

My source is that they're what my brother was using in his Windows 10 computer, until he finally got a better system last month.

Ehhhhh... yeah, they work, but there's a big caveat, speaking as someone who owns a 4350, a 4650, and a 4850 and has tried all three in 10.

If you use HDMI, there's roughly 7% overscan on AMD cards by default, which is normally disabled in Catalyst Control Center. Catalyst Control Center will refuse to start with the Microsoft-provided drivers, which are your only option in Windows 10 on a pre-5000 series card. The only ways in Windows 10 to actually use your whole screen and not get black bars (which drive me nuts, personally) are to use DVI/VGA instead or to do registry hacks to disable the overscan.

I haven't let this one issue keep those systems stuck on 7 or force me to buy a new GPU - the registry hacks aren't that hard and they work - but you should be aware that this is a thing before you upgrade.

Coincidentally, the newest cards available for AGP are 4000-series, so if you still have a working AGP desktop and want hardware decoding to play web video on an ancient CPU in Windows 10, it's good to be aware of this too.

Eletriarnation fucked around with this message at 07:13 on Feb 9, 2016

Eletriarnation

mayodreams posted:

:stare: :stare: :stare:

I think you should be more concerned about how much longer your 8 year old hardware will hold up.

Coming up on 12 years, actually. It's an Alienware desktop from mid-2004 with an ASUS Socket 478 motherboard and a CT-479 adapter to be able to run a Pentium M and overclock it with desktop-class cooling. My first real gaming desktop and quite nice in its day; it was only replaced at the end of 2008 when Nehalem came out.

I'm not concerned about it because it's not my primary system - that's a 2500K - but it still works great, so I put Windows 10 on it to see how it would hold up. It's not bad with 4GB of memory; I could use it in a bind if all my other desktops somehow broke. Hardware decoding for some video works on the 4650 that I found for $25 on eBay, but anything that has to use software decoding will run like poo poo on a single-core processor.

The more impressive thing to me is that the original 80GB Seagate 7200.7 SATA drive attached to it still works perfectly with no bad sectors. Dog slow compared to a new drive but I can't complain about longevity.

blowfish posted:

I've never seen an AGP slot in my life. Are you sure they still exist? :psyduck:

They haven't been made for years, but if you have one there's not much you can do about it; PCI is a lot slower and the selection doesn't improve much.

Eletriarnation fucked around with this message at 15:33 on Feb 9, 2016

Eletriarnation

Generic Monk posted:

i'm pretty sure the time you would waste getting an agp-era desktop to play nice with windows 10 (on top of waiting 5 minutes each boot etc) would probably be better spent in gainful employment to pay for a new computer

Honestly, it works perfectly with no extra trickery needed except for the aforementioned overscan issues with the 4650. Drivers installed automatically for literally every other part in the system, and I haven't noticed that bootup is any slower than with other systems that still boot off HDDs. The only thing keeping me from putting it to use is that I have 4 newer desktops and just don't have a need for it, and the only reason I bothered in the first place is that I was curious to see if there was any reason it wouldn't work.

There is actually a reason it won't work if you have one of the earlier Pentium Ms, the 7x5 series - it's either lack of PAE support or of the NX bit, but either way you can't go past Windows 7. No problem once I found a 760 and swapped that in, though.

Panty Saluter posted:

GeForce 6800, if memory serves. I don't upgrade super often.

Yeah, I don't either. My upgrade path went like this:
Radeon 9600 XT (came with AGP desktop in 2004) -> 6800 GS (in 2006 - unlocked to Ultra, too!) -> Radeon 4850 (in Nehalem system, 2008) -> 7850 (in Sandy Bridge system, 2012) which I'm still using now. Polaris or Pascal will probably motivate me to replace the 7850.

I actually ended up selling the 6800 GS on SA-Mart well after I upgraded to the 4850 and swapped in an old Radeon 9550 just to keep the AGP system running, but after I put Windows 10 on it (which also worked just fine with the 9550) I decided I should probably max out the final part of the system just for kicks and bought the AGP 4650. As far as I can tell, better cards were never made for AGP and honestly I'm surprised they kept it alive past 2006 or so when NVIDIA dropped it.

Eletriarnation fucked around with this message at 17:25 on Feb 9, 2016

Eletriarnation

VulgarandStupid posted:

It's been literally 10 years since AGP was phased out... If anyone is still using one, your smart phone is probably faster.

Oh, it totally is. Still cool that Windows 10 works on it. It works on my single-core Atom N450 netbook too, and that one even supports 64-bit, although with only 2GB of memory maximum that won't make much difference.

Honestly, Vista was kind of a high-water mark for system requirements. We're almost back down to just what you would need to make XP run smoothly at this point; I actually wonder if 10 would run on a P3-1GHz with 512MB of memory.

Actually, never mind... if it didn't work on a first-generation Pentium M, there's no way in hell it will work on the processor that one was developed from. drat you, NX bit!

Eletriarnation fucked around with this message at 17:53 on Feb 9, 2016

Eletriarnation

Panty Saluter posted:

I hate to sound like a broken record, but an Accelero worked wonders on my 4850. The 4xxx was designed to run hotter than hell anyway (not that running it cooler isn't helpful)

I have this ASUS model with a good-sized copper cooler on it and don't remember ever having a problem when it was my main card, although I broke it out recently and tried FurMark, which caused it to crash. One cleaning and regreasing later, it's just fine and probably doesn't go over 75C at load.

I did have issues with my 6800 GS, which was an XFX model with a single-slot cooler that sucked rear end. It would get to 120C when playing BF2 before locking up and crashing the system. I replaced that with an Arctic Cooling 2-slot blower and it didn't even hit 60C under load anymore - that thing was incredible.

Eletriarnation

eggyolk posted:

I've been meaning to ask this for a while.
My workstation PC has a 5820K in it with a 240mm AIO liquid cooler. I built it myself and it runs great for Solidworks stuff. During rendering it clocks 4.4 at 50C.
Problem is that I've been trying to get it to run at 3.6GHz default because some programs don't trigger the turbo boost and it runs slow as poo poo. After adjusting the bios to a higher minimum clock, it seems to switch between 1.2Ghz and 3.6 at a very high frequency according to Intel Power Gadget. Is this safe for the CPU? It switches frequencies several times per second and I'm worried it's damaging things. How do I get it to run at a constant bottom end speed?

This behavior is totally normal and actually has a name, SpeedStep - it's been around a while, at least ten years on mobile processors. It allows the processor to save a lot of power when it's not loaded, and it definitely won't cause damage, since it's just underclocking and undervolting the processor on the fly. It should stay at full speed under sustained load, though.

Some motherboards have a feature to peg the processor at full speed (actually, Windows might too in advanced power options) but it doesn't really get you anything but a higher power bill unless your proc isn't staying at full speed when loaded for some reason.

Eletriarnation
I have the mini-tower (microATX) Inspiron 530, and from what I recall, whether you can use a Core 2 Quad depends on whether you have the G33M02 or G33M03 motherboard - they're identical except for the power regulation, which is less robust on the M02 and not rated to go over 65W. I have the M02, and if you put in a quad-core (even a 65W Q8200S) it won't power on. It works great with the top-end C2D E8600, but that's not a lot better than what you have.

If you have the M03 version, then a quad off eBay would be a big upgrade alongside a new graphics card. If not, you could get a new GPU but will definitely want to look at a CPU upgrade in the near future.

The PSU pinouts on the motherboard and mounts in the case are standard ATX so you can buy any old PSU to use. Of course, all of this applies only to the mini-tower; I think there's a SFF model and things will be much more restricted there.

Ed.: Dell support thread on the issue.

Eletriarnation fucked around with this message at 20:09 on Mar 11, 2016

Eletriarnation

Ihmemies posted:

It has no real gpu. What do you do with a box like that? Is it meant for people who live in 10m^2 apartments in Manhattan?

I use the Broadwell i3 model as a work desktop. It takes up no space on my desk, supports 2 monitors and all the connectivity I need, and uses very little power.

This new Skull Canyon model would actually be pretty capable for light gaming if I had to guess. I've played MGSV on a desktop with a Haswell low-power quad using integrated graphics and it was alright at 720p/low detail. A lot of older or indie games would work just fine too, and the mobile quad is going to be almost as fast as a desktop Skylake quad since it's not likely to be thermally throttled in that form factor.

Of course, it probably won't compare to a discrete card costing $150+, but that card alone uses as much power as the whole unit. You could add a Thunderbolt dock like the Razer Core, but by the time you do that you're using all of the power, money, and space of a real mini-ITX gaming desktop, so you'd have to have another reason to go that route.

Eletriarnation

Gorau posted:

I can pray that it is though right?

If Intel had a way to make n threads work faster by adding more than n cores, for any given n, it would be the biggest news in CPUs in at least 10 years. Not being able to increase single-threaded performance further is why Intel abandoned the P4 NetBurst architecture and started increasing core count.

The article implies that Skylake may still be pooling some sort of shareable resource like cache more effectively for single-threaded applications, and on a really cache-sensitive test I can see that making a big difference, but I am skeptical that they've found the One Weird Trick of making one thread run like lightning on four cores.

Furthermore, this article is from August so anything incredible should have popped up on a lot more sources by now.
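The limit being described here is just Amdahl's law: cores beyond the parallel width of the workload contribute nothing to its serial portion. A quick sketch with illustrative numbers:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: best-case speedup when only part of the work
    can be spread across cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A fully serial (single-threaded) program gains nothing from extra cores:
print(amdahl_speedup(0.0, 4))                    # 1.0
# Even 50%-parallel work caps out at 2x no matter how many cores you add:
print(round(amdahl_speedup(0.5, 1_000_000), 3))  # 2.0
```

This is why smarter cache sharing is a plausible single-thread win while "four cores run one thread" is not.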

Eletriarnation fucked around with this message at 17:36 on Mar 27, 2016

Eletriarnation

FunOne posted:

Nope, good call. Was looking at an order for someone else for 8 gigs, it was 75 bucks to get another 16 gigs. But still, who cares! With just Chrome + other background poo poo I'm running near 16 right now.

Keep in mind that used memory != needed memory. Most OSes will happily cache recently-used data rather than letting the memory sit free, because it's of more potential use that way, but if there's demand they will drop cached items and free the memory for programs that actively need it. As an example, I'm sitting at 9.2GB in use out of 16 right now, but while it says there's only 6.8GB left "available", 6.5GB of what's used is cache, so I'm actually over 13GB away from running out.

You may well be aware of this already, but anyone reading should know that Windows will happily use as much memory as you give it, within reason. There was a lot of hate about this when it ramped up in Vista, because people saw memory that sat idle in XP suddenly being used and thought "drat, Vista is a memory hog!" when it was just being proactive.
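As a back-of-envelope version of the numbers above (16GB total, 9.2GB "in use", 6.5GB of that reclaimable cache):

```python
def headroom_gb(total_gb, in_use_gb, cache_gb):
    """Memory programs could still claim: free space plus cache the OS
    will drop on demand. A simplification - real OSes track more states
    (modified pages, standby lists, and so on)."""
    free_gb = total_gb - in_use_gb
    return free_gb + cache_gb

# Figures from the post: 6.8 GB free plus 6.5 GB of reclaimable cache.
print(round(headroom_gb(16.0, 9.2, 6.5), 1))  # 13.3
```

So the headline "available" figure understates how far the system actually is from memory pressure.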

Eletriarnation fucked around with this message at 21:58 on Apr 12, 2016

Eletriarnation

wipeout posted:

This is great advice; will Intel be releasing a new chipset for broadwell e? Not that there is anything I'd want that isn't in x99 already.

It is possible but seems unlikely, as Sandy Bridge-E and Ivy Bridge-E shared the X79 chipset and as far as I know X99 supports Broadwell-E.

Eletriarnation

EdEddnEddy posted:

How much does this cost over the free Windows Backup method? I could see a direct clone getting you up quicker but at what cost?

Macrium has a free edition that can do this.

Eletriarnation

mobby_6kl posted:

Is it at all possible to overclock the i5-2500 non-K on Q67? I might be able to get such a machine for very cheap/free from work. Even stock it would be much faster than my C2Q and really would be good enough to scrap my Skylake plans altogether if oveclockable.

No. Even on Z68 (or whatever the right chipset is), the max multiplier is locked at stock. Without a Z-series chipset I'm pretty sure you don't even have the option to change the multiplier, and you can't change the BCLK much because other things in the system, like the PCIe ports, will bug out - so you're stuck at stock.

Eletriarnation fucked around with this message at 21:18 on Apr 14, 2016

Eletriarnation
Well, there actually is a newer version of the T100 - the T100HA has Cherry Trail, 64GB of storage and 4GB of memory compared to Bay Trail/32GB/2GB in the T100TA. It's not really cheaper though unfortunately.

Eletriarnation

Tab8715 posted:

I'm not following, weren't netbooks an abysmal failure?

How do you define that? ASUS is still making them so I assume there's a market, and the smaller Chromebooks seem like basically netbooks to me - they're just running a particular OS that allows a manufacturer to cut the bill of materials even further.

Eletriarnation

Combat Pretzel posted:

Oh my mistake then. I'd figured that the Xeon line would have some close equivalents. I guess the QPI stuff makes it expensive.

NihilismNow posted:

I thought "you need the Xeon featureset? gently caress you pay me" made it expensive.
If you need dual socket, ECC, etc where are you going to go? Exactly, nowhere. So pay up or scrounge some old poo poo from ebay.

Yeah, QPI has been the replacement for the front side bus since Bloomfield/Nehalem in late 2008, and it's present in consumer chips as well. Xeon is expensive for the same reason Windows Enterprise editions are expensive: it has features that businesses want and will pay a premium for, and there's no easy alternative.

On the topic of old eBay hardware, does anyone know if the wattage readout for the proc in HWMonitor is expected to be accurate? I have an old i7-920 that had been semi-retired as an HTPC, but I recently found out that the motherboard supports a 60W Xeon L5520 and found one on eBay for $9 so that I can tinker with it as a home server. However, while I wait for the Xeon to arrive I started experimenting with undervolting the i7, and found that my 920 is stable at 0.9V if I cap it at 2.4GHz (18x). At those settings, HWMonitor reads 27W under full load. This is a processor with a 130W TDP, so I'm having a hard time being confident that 27W is accurate - but if it is, I'm also concerned that the Xeon won't even be an improvement.

ed.: I guess this might be really a question about how accurate the sensors are that HWMonitor is reading from, but I'm not even sure if that would be a function of the motherboard or the processor.
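As a back-of-the-envelope cross-check (Nehalem predates Intel's on-die RAPL energy counters, so the sensor question is a fair one): dynamic CPU power scales roughly with frequency times voltage squared, so you can estimate what the undervolted chip should draw by scaling the stock figure. The stock voltage below is an assumed typical VID, not a measured value, and TDP is an upper bound rather than actual load power, so treat the result as a ballpark only.

```python
# Rough sanity check of HWMonitor's 27W reading on an undervolted i7-920.
# Dynamic power scales approximately as P ~ f * V^2, so scale the stock
# figure by the frequency and voltage ratios. The 1.25V stock voltage is
# an assumption for illustration, and 130W TDP is a ceiling, not a
# measured load power.

def scaled_power(p_stock, f_stock, f_new, v_stock, v_new):
    """Estimate power after a frequency/voltage change via P ~ f * V^2."""
    return p_stock * (f_new / f_stock) * (v_new / v_stock) ** 2

# i7-920: 130W TDP at 2.66GHz stock; undervolted to 0.9V at 2.4GHz.
estimate = scaled_power(130, 2.66, 2.4, 1.25, 0.9)
print(f"Estimated load power: {estimate:.0f}W")  # roughly 61W
```

Since real load power usually sits well under TDP, a 27W reading isn't implausible, but this formula suggests it's on the optimistic side.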

Eletriarnation fucked around with this message at 23:33 on May 2, 2016

Eletriarnation
Apr 6, 2005


1st_Panzer_Div. posted:

I'm about to hit a 4790k for $275, as the move to a 5820k would be about $600, and I can't possibly imagine the performance increase being worth the over double price tag, even if it's still on an H97, I can get some OC out of it, and it's a big jump from an i5 Haswell.

If it's on an H97, I'm pretty sure you actually can't get any OC out of it, unless I missed something. The performance increase isn't going to be anywhere near 100% unless you have something that scales smoothly to 12 threads and you overclock the 5820K but not the 4790K - but if you consider the whole system price instead of just proc/MB, it is appealing for some use cases.
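The thread-scaling caveat can be made concrete with an Amdahl's-law sketch: speedup from extra cores is capped by the workload's serial fraction, so six cores only pull meaningfully ahead of four when the workload is almost entirely parallel. The fractions below are illustrative, not measurements of any real game or application.

```python
# Amdahl's-law sketch of why 6 cores rarely beat 4 by much: speedup is
# limited by the fraction of the workload that can run in parallel.
# The parallel fractions here are made-up examples, not benchmarks.

def speedup(parallel_fraction, n_cores):
    """Ideal speedup over 1 core for a given parallel fraction."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / n_cores)

for p in (0.5, 0.9, 0.99):
    s4 = speedup(p, 4)
    s6 = speedup(p, 6)
    print(f"parallel={p:.2f}: 4 cores {s4:.2f}x, 6 cores {s6:.2f}x")
```

At 90% parallel, six cores give 4.0x versus 3.1x for four - noticeable, but nowhere near the 50% core-count increase would suggest, which is why the whole-system price matters more than the core count for most gaming builds.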

Eletriarnation
Apr 6, 2005

Yeah, it's totally fine to stop at a normal i7-*K or even an i5 for games, because most don't scale well beyond 4 threads, if that. Make sure you get a Z-series chipset, though; not being able to overclock is just leaving performance on the table.

Eletriarnation
Apr 6, 2005


LiquidRain posted:

Is a Z board a real requirement for the 4790k? "multi core turbo enhancement" is enough to push one of those to its limit.

It isn't as compelling as it would be with Skylake or Sandy Bridge, but if you care about the performance enough to pay ~$100 extra for an i7 with HT, then it makes sense to pay substantially less than that for a Z chipset. At bare minimum you'll be able to peg the chip at its max turbo of 4.4GHz, and from what I'm seeing people can typically get 4.6-4.7GHz stable. Also, why are we considering Haswell instead of Skylake - did I miss that?

1st_Panzer_Div. posted:

So you literally can't OC an H board?

Correct.

Eletriarnation fucked around with this message at 00:42 on May 3, 2016

Eletriarnation
Apr 6, 2005


DrDork posted:

Just because a manufacture slaps "supported!" on the side of the box does not make it a good idea. A 2cm thick heatsink "supports" a 160W Xeon in the sense that "it won't let it thermally damage itself" but only because the chip is smart enough to throttle the gently caress out of itself instead of melting if it ever needed to clock up past idle. Something that small might be viable for a lower-TDP Xeon, like maybe one of the 80W E3's, that is stuck on file server/light VM duty and will probably never run full-tilt. But a 5820k, that you know he's going to want to overclock? Let's be real.

Eh, you'd be surprised what you can do with high airflow. I have a 1U server at work with two E5-2697 v3s in it - that's a 14-core, 145W Haswell-EP processor - and it's a fully supported configuration, shipped from the manufacturer like that, with no sign I can see that it's throttling. You probably don't want to deal with the noise from airflow that high at home, though.

Eletriarnation fucked around with this message at 21:17 on May 4, 2016

Eletriarnation
Apr 6, 2005

I feel like we also have to remember that the regular 6600's TDP is only 65W - the 6600K's 91W already gives some extra headroom over what the processor should use at stock settings. Allowing for an 80W rise over that means you're expecting to OC a 65W processor to the point that it draws about 170W, which seems frankly impossible unless you're using LN2 or some other exotic form of cooling.

Personally I would go for the 6600K, because it's not a substantial premium and I think you'll end up buying an aftermarket cooler anyway with that case. The linked case review notes that even with Prime95 and Furmark (a rather unrealistic load) their i5-4690K only got up to 55C at worst. That's a fine temperature with lots of headroom, and I wouldn't expect the comparable Skylake chip at stock settings to be any worse. So, I fully expect you could at least do a multiplier-only overclock, then see how it affects your power draw and temperatures before deciding whether to push the voltage. If you think it's a bottleneck after you get the 1080 and you don't trust the power supply to let you raise the voltage, a better SFX supply is cheap compared to the 1080.

Eletriarnation fucked around with this message at 14:19 on May 18, 2016

Eletriarnation
Apr 6, 2005


HMS Boromir posted:

As someone who's never bought a laptop, I associate i3/i5/i7 with 2c/4t, 4c/4t, 4c/8t, and it's really confusing whenever I want to help a friend pick out a laptop because I have to figure out what they actually mean there. The one thing I think I've learnded is that high end quad cores are i7-####HQ. Is that consistent or are there exceptions?

Pentium? Core i5? Core i7? Making sense of Intel’s convoluted CPU lineup

In a laptop, the i*-xxxxU (or M, for older generations) parts are 15W (35W for M) dual-cores with HT and turbo, and i5/i7 just means it goes faster. There are i5-6xxxHQ parts now, which are 35-45W quads like the i7s, just slower. Generally with laptops you want to look at the letters first to get an idea of what kind of part it is, then at the number to see where it ranks among parts of that type. Weird non-round numbers - something in the ones digit, or i*-xx10 - usually mean upgraded Iris graphics or some other niche feature.

Iris branding is its own little puzzle, since some models just have more EUs in the IGP, some also allow a higher TDP (see: 13" Macbook Pro processors), and some (Iris Pro, i.e. Crystal Well) have an L4 cache that can be used for graphics as needed.
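To make the "letters first, then number" rule concrete, here's a toy decoder for that era's mobile suffixes. The mapping is my own informal paraphrase of the convention described above, not an official table from Intel ARK.

```python
# Rough decoder for Skylake-era mobile Core model numbers, per the rule
# of thumb above: read the suffix letters for the class of part, then
# the digits for its rank within that class. Informal mapping only.

SUFFIXES = {
    "U": "15W dual-core with HT (ultrabook class)",
    "M": "35W dual-core with HT (older generations)",
    "HQ": "35-45W quad-core",
}

def decode(model):
    """e.g. 'i7-6700HQ' -> ('i7', '6700', '35-45W quad-core')."""
    family, rest = model.split("-")
    digits = "".join(ch for ch in rest if ch.isdigit())
    letters = rest[len(digits):]
    return family, digits, SUFFIXES.get(letters, "unknown suffix")

print(decode("i7-6700HQ"))
print(decode("i5-6200U"))
```

So an i5-6200U and an i7-6600U are the same kind of 15W dual-core part, while an i5-6300HQ is a different animal entirely despite the lower family number.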

Eletriarnation fucked around with this message at 17:24 on May 20, 2016

Eletriarnation
Apr 6, 2005


mediaphage posted:

Yes but you'll want an Ethernet drop

It's nice but shouldn't be necessary if your wireless is good enough. I've done streaming with AC wireless to an AP in the same room and it works great.

Eletriarnation
Apr 6, 2005


mediaphage posted:

Sure though that is of course the absolute ideal scenario you understand.

That's true. It also worked fairly well with a 2.4GHz N access point about 20 feet away through drywall, but if you're trying to go through exterior walls or between floors, in my experience you probably wouldn't want to count on it. To be specific, this is with MacBook Pros and Ubiquiti APs.

Eletriarnation fucked around with this message at 01:20 on May 21, 2016

Eletriarnation
Apr 6, 2005

The biggest issue with putting a 1U server in your house is that they tend to be full of high RPM 40mm fans that are shrill and loud as hell. I found a spare Sandy Bridge-E one being thrown out at work and took it to my desk to tinker with, but I quickly realized that anyone sitting near me was not going to be happy with that idea.

Eletriarnation
Apr 6, 2005


Gwaihir posted:

1Us aren't going to be too great, but I have a gigantic pile of surplus R710s at work, and they have good BIOS fan control to stay pretty comfortably quiet. They're not on a normal desktop level of course, but they're a very long way from full throttle screaming fans non stop- They're reasonable enough to have one in your office at work if you needed to fuss with it, for example.

Yeah, I can believe that. The one I have is a Cisco UCS C220, which might be similar, but it started out loud enough at my desk that I turned it off immediately. It doesn't sound all that loud racked up with other equipment, but when the ambient noise level is over 70dB it's hard to tell, really.

Eletriarnation
Apr 6, 2005

How loud is it?


Eletriarnation
Apr 6, 2005


sout posted:

Cheers.
I've just been recommended to use a wristband in the past when messing with parts I guess :shrug:

The idea is that the wristband grounds out any static charge you're carrying while you work, so it doesn't discharge through the components you're touching and damage them. In practice, unless you're somehow actively generating static while working on the computer, it's sufficient to touch an unpainted bit of the case to ground yourself at the start, before you touch anything else.

Eletriarnation fucked around with this message at 16:29 on Jun 6, 2016
