 
teagone
Jun 10, 2003

That was pretty intense, huh?

Confounding Factor posted:

Here's my lovely build:

A cable clusterfuck:



Please do some cable management. I have the same case and that picture is making me sad.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

k-uno posted:

Maybe this isn't the most appropriate thread for this question, but out of the boutique system vendors, which is the best regarded? I'm building a workstation PC for physics research (e.g. tons of parallel 64-bit floating point calculations) and I don't want to go with my university's preferred vendors (Apple and Dell) because the cost is just outrageous. I could probably convince IT to purchase a complete system from somewhere like AVAdirect, but buying all the individual parts from NewEgg might get turned down, and not having a full-system 3-year warranty would be a pain in the rear end. Gaming performance is irrelevant, but a good workstation graphics card with CUDA/OpenCL support is important because more and more applications are starting to natively support it.

With AVAdirect I can get an i7-5960X and a FirePro W8100 (the most double-precision GFLOPS per dollar of any workstation card right now) with 32 GB of RAM for around $3500, which is 1/2 to 2/3 the cost of similar specs from Dell or Apple, but I don't know how well regarded they are. Does anyone have any experience with them? Or alternatively, is there another vendor you'd recommend?

Puget Systems is pretty well regarded, you should check them out.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

The Wonder Weapon posted:

Can you confirm your links are correct? The EVGA card you provide (http://www.newegg.com/Product/Product.aspx?Item=N82E16814487136) says in giant letters that a newer version of the card is available, and directs to the one I linked instead. Did they gently caress up the cooler on newer models?

The one I linked is the newer model; look at the model numbers. 04G-P4-3973-KR is the one I linked, with the ACX 2.0+ cooler; the one Newegg incorrectly redirects to is 04G-P4-2974-KR. 2XXX models have the old ACX 2.0 cooler, 3XXX models have the ACX 2.0+. I dunno what the hell Newegg is doing there, but they're redirecting people to an older model, not a newer one.

cr0y
Mar 24, 2005



Sort of on the topic of building systems, can anyone recommend an application that will show CPU/GPU temperatures in my system tray or on the desktop? I would like to keep an eye on things until I decide if I am happy with my fan/airflow setup.

The Wonder Weapon
Dec 16, 2006



AVeryLargeRadish posted:

The one I linked is the newer model; look at the model numbers. 04G-P4-3973-KR is the one I linked, with the ACX 2.0+ cooler; the one Newegg incorrectly redirects to is 04G-P4-2974-KR. 2XXX models have the old ACX 2.0 cooler, 3XXX models have the ACX 2.0+. I dunno what the hell Newegg is doing there, but they're redirecting people to an older model, not a newer one.

Ah ok, gotcha.

Do you think it's really worth getting a 970 over a 290? This XFX 290 (http://www.newegg.com/Product/Produ...ID=3938566&SID=) is a full $40 cheaper than the EVGA 970. How much more performance am I going to get out of the 970 here? Will the card matter at 1920x1080? Will the rest of my system bottleneck elsewhere, rendering the advantages the 970 may provide null? The difference between $275 and $315 isn't that large, but I also don't want to pay $40 for extra juice I'll never really use/see.

Josh Lyman
May 24, 2009


The Wonder Weapon posted:

Ah ok, gotcha.

Do you think it's really worth getting a 970 over a 290? This XFX 290 (http://www.newegg.com/Product/Produ...ID=3938566&SID=) is a full $40 cheaper than the EVGA 970. How much more performance am I going to get out of the 970 here? Will the card matter at 1920x1080? Will the rest of my system bottleneck elsewhere, rendering the advantages the 970 may provide null? The difference between $275 and $315 isn't that large, but I also don't want to pay $40 for extra juice I'll never really use/see.
I switched from a Gigabyte 290 to a Gigabyte 970 a couple weeks ago. There's basically no difference at stock speeds, but if you want to OC the 970, you can get a more noticeable difference, on the order of 18%.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Niwrad posted:

I'm buying RAM and was set on 8GB. Noticed it's $35 more to go to 16GB. Is there any value at all to going to 16GB? It seems like the consensus is that it'd be a waste, but I also remember people talking about how you wouldn't need more than 4GB a few years ago. Mainly just using the computer for Chrome, Office, and some strategy gaming (Civ 5, Football Manager).

Just wondering if there would be any benefit to it, even if I don't see it for a couple years. Or if I'm just lighting $35 on fire.

A few years ago I opted to grab 16gb and have never once regretted it. Especially since the prices are still higher now than they were for the same memory at the time. Memory tends to hold its value quite well so you can easily offload it if need be.

It all depends on the amount of multitasking you do (I do quite a lot), but there has never been an instance where I've lamented having too much of anything in regards to specs.

Ernie.
Aug 31, 2012

Confounding Factor posted:

Hey guys, I have encountered an issue with my poverty build.

I did a little googling last night and apparently CS:GO takes advantage of CPUs that have at least 3 cores. I bring this up because I am getting awful stuttering whenever I get into areas of the game where performance takes a hit (multiple people on screen, more objects, etc.).

It's bizarre cause my refresh rate on the monitor I'm using is 85Hz, but I'm getting a steady 120-200 FPS on the lowest settings.

CS:GO is a lovely game that has a bunch of 'optimizations' that try to make it work simultaneously on the worst and the best computers, breaking it for everyone in the middle.

Try putting the following in your launch options:
-novid -high -threads 4
And definitely add the following to your config:
cl_forcepreload "1"

That should fix things.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

The Wonder Weapon posted:

Ah ok, gotcha.

Do you think it's really worth getting a 970 over a 290? This XFX 290 (http://www.newegg.com/Product/Produ...ID=3938566&SID=) is a full $40 cheaper than the EVGA 970. How much more performance am I going to get out of the 970 here? Will the card matter at 1920x1080? Will the rest of my system bottleneck elsewhere, rendering the advantages the 970 may provide null? The difference between $275 and $315 isn't that large, but I also don't want to pay $40 for extra juice I'll never really use/see.

As Josh Lyman said, once you OC the 970 it will be solidly ahead of the 290. The 970 also has more stable performance, with higher minimum frame rates and somewhat lower maximum frame rates, which leads to games feeling smoother overall.

The Wonder Weapon
Dec 16, 2006



AVeryLargeRadish posted:

As Josh Lyman said, once you OC the 970 it will be solidly ahead of the 290. The 970 also has more stable performance, with higher minimum frame rates and somewhat lower maximum frame rates, which leads to games feeling smoother overall.

What is involved in OCing the 970? If it's any more than "change a number in the config" then I'll probably avoid it, since I have neither the time nor interest required for enthusiast levels of hardware tweaking.

e: If this video is an accurate representation, that isn't bad at all
https://www.youtube.com/watch?v=VIzICd3mnc8

The Wonder Weapon fucked around with this message at 20:08 on Aug 16, 2015

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

The Wonder Weapon posted:

What is involved in OCing the 970? If it's any more than "change a number in the config" then I'll probably avoid it, since I have neither the time nor interest required for enthusiast levels of hardware tweaking.

e: If this video is an accurate representation, that isn't bad at all
https://www.youtube.com/watch?v=VIzICd3mnc8

Yeah, it's that easy.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Not sure if this is a bit too technical to ask here, but let's say I have an EVGA 1300w Supernova G2 powering 300w TDP worth of computer:

The fan on this PSU does not have a silent mode at low temps like the 850w and below models have, it runs at a pretty loud full-blast. If I wanted to disable the fan, either by unsoldering it, soldering a potentiometer in-line with the fan wire so I can adjust speeds, or just using something like a zip-tie to physically hold the fan blades in place, which of those would be least risky in terms of overheating the PSU or permanently screwing up the fan or electronics?

A gold PSU at 20% load is supposed to be 87% efficient, so 300 watts load suggests about 50 watts of waste heat, if that helps figure this out any.
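
(If anyone wants to sanity-check that estimate, here's a quick back-of-the-envelope sketch in Python. The 87% figure is just the 80 Plus Gold spec point at 20% load; a real unit's efficiency curve will differ, so treat the result as a ballpark.)

# Rough waste-heat estimate for a PSU running well below its rating.
def psu_waste_heat(load_w, efficiency):
    wall_draw = load_w / efficiency  # power pulled from the outlet
    return wall_draw - load_w        # the difference is dissipated inside the PSU as heat

print(round(psu_waste_heat(300, 0.87)))  # ~45 W, i.e. "about 50 watts" as estimated above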

strong bird
May 12, 2009

Zip ties sound like a bad idea, personally. Wouldn't the motor that turns the fan keep going?

grack
Jan 10, 2012

COACH TOTORO SAY REFEREE CAN BANISH WHISTLE TO LAND OF WIND AND GHOSTS!

Zero VGS posted:

Not sure if this is a bit too technical to ask here, but let's say I have an EVGA 1300w Supernova G2 powering 300w TDP worth of computer:

The fan on this PSU does not have a silent mode at low temps like the 850w and below models have, it runs at a pretty loud full-blast. If I wanted to disable the fan, either by unsoldering it, soldering a potentiometer in-line with the fan wire so I can adjust speeds, or just using something like a zip-tie to physically hold the fan blades in place, which of those would be least risky in terms of overheating the PSU or permanently screwing up the fan or electronics?

A gold PSU at 20% load is supposed to be 87% efficient, so 300 watts load suggests about 50 watts of waste heat, if that helps figure this out any.

It's generally not a great idea to gently caress around with the cooling fan in a PSU.

That said, I've totally opened a PSU, removed the stock fan and replaced it with a relatively quiet 120mm fan and then plugged the fan in to a spare motherboard header for speed control. I then proceeded to use that PSU for four years jury-rigged together (it helped that it was a really high quality Strider). Before anyone asks, I didn't have a single component failure in that time.

So yes, it can be done. Probably not a great idea, but it can be done.

k-uno
Jun 20, 2004

Krailor posted:

Puget Systems is pretty well regarded, you should check them out.

I looked at them as well, but they're priced as high as Dell so it doesn't seem worth it.

By the way, how much should a PSU's rated wattage exceed the expected system wattage at full load? Figure 140W for the i7 (I'm not planning to overclock, this machine needs to be reliable), 200W for the graphics card, and maybe 50W max for the RAM, motherboard and drives? Would a 650W power supply be adequate to ensure complete stability?

grack
Jan 10, 2012

COACH TOTORO SAY REFEREE CAN BANISH WHISTLE TO LAND OF WIND AND GHOSTS!
You don't need to guess; Intel and Nvidia publish maximum power draw figures for their parts. For example, an i7-4790K is an 88W part, not 140W.

Add your CPU and GPU power draw, add 50W for the rest of your system, and multiply by 1.25 for your minimum PSU rating. 650W is likely overkill.
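
(A minimal Python sketch of that rule of thumb, for anyone who'd rather plug numbers in than do it by hand. The 200W GPU figure below is just k-uno's own estimate from the post above, not a published spec.)

# PSU sizing rule of thumb: CPU TDP + GPU TDP + ~50W for everything else, times a 1.25 margin.
def min_psu_watts(cpu_tdp_w, gpu_tdp_w, rest_w=50, margin=1.25):
    return (cpu_tdp_w + gpu_tdp_w + rest_w) * margin

print(min_psu_watts(88, 200))   # i7-4790K example above: 422.5, so a quality 450-500W unit is plenty
print(min_psu_watts(140, 200))  # k-uno's 140W X99 chip plus his ~200W GPU estimate: 487.5, so 650W has headroom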

k-uno
Jun 20, 2004

grack posted:

You don't need to guess; Intel and Nvidia publish maximum power draw figures for their parts. For example, an i7-4790K is an 88W part, not 140W.

Add your CPU and GPU power draw, add 50W for the rest of your system, and multiply by 1.25 for your minimum PSU rating. 650W is likely overkill.

It's going to be an X99 board with an 8 core chip, so 140W TDP is actually the correct figure. But 1.25 as a multiplier sounds like a fairly comfortable window.

The Sweet Hereafter
Jan 11, 2010

This is for DDR3L, correct? There's no way to reuse regular DDR3 RAM, is there?

Hadlock
Nov 9, 2004

Specifically for Full ATX boards

What is the recommended Z170 chipset motherboard? It looks like they all float around $135-180, with the median being ~$165, and those include a single USB Type-C connector onboard (:dance:)

Also there are a couple of Z170 boards that cost $280-320 and seem to have the exact same feature sets. I can't figure out why they cost so much more.

I presume with all this VR goggle funny business that dual-SLI video cards are finally going to be relevant, so I'm going to want all the lanes that Z170 provides and two x16 PCIe slots, but I can't imagine why

$164: ASUS Z170-A LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 USB 3.0 ATX Intel Motherboard - http://www.newegg.com/Product/Product.aspx?Item=N82E16813132566&cm_re=Z170-_-13-132-566-_-Product

$319: ASUS Z170-DELUXE LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 USB 3.0 ATX Intel Motherboard - http://www.newegg.com/Product/Product.aspx?Item=N82E16813132568&cm_re=Z170-_-13-132-568-_-Product

As far as I can tell, it has an extra M.2 slot, 2 extra SATA2 ports (for a total of 8 SATA), one E-SATA and a second Gig-E port.

Confounding Factor
Jul 4, 2012

by FactsAreUseless

Ernie. posted:

CS:GO is a lovely game that has a bunch of 'optimizations' that try to make it work simultaneously on the worst and the best computers, breaking it for everyone in the middle.

Try putting the following in your launch options:
-novid -high -threads 4
And definitely add the following to your config:
cl_forcepreload "1"

That should fix things.

Yeah I've done that already and followed the advice here.

Nothing is working, and now I'm having the same issue when I play CS:Source. It just doesn't make any sense. I'm frustrated cause I went the cheap route and the two games I play the most aren't working out.

The Wonder Weapon
Dec 16, 2006



I think I'll just take the 970, OC it a bit, and be done with it.

One last quick question about matching RAM. I've got this installed currently: http://www.newegg.com/Product/Product.aspx?Item=N82E16820231550 What, if anything, do I need to keep in mind when adding to this? Any reason to pick 2x 4gb sticks over 1x 8gb stick other than future expansion options? Should I worry about timing, DDR3/4/5, etc? Or can I buy basically anything and slam it in and have it run no problem? For instance, would this work? http://www.newegg.com/Product/Produ...ID=3938566&SID=

The Wonder Weapon fucked around with this message at 00:27 on Aug 17, 2015

boneration
Jan 9, 2005

now that's performance
Wife needs a new video card. Her last card was a GTX 660 that bit the dust today. She wants high/ultra quality, but her monitor resolution is 1680 x 1050. She plays first-person RPGs like Skyrim and is stoked as hell for Fallout 4 so I don't wanna get her something that's lovely but that weird rear end resolution is throwing me off. Am I looking at a GTX 760/R9 270X type situation here? We're in Canada and would rather not get too spendy, $300 is the ceiling here. The parts picking guide is old and I wasn't sure if it is still accurate, plus I can't find much selection of some of the cards listed at NCIX.

If it matters her CPU is an i5-3570K and she has an SSD and 16gb of RAM. Thanks helpful people.

The Iron Rose
May 12, 2012

:minnie: Cat Army :minnie:

boneration posted:

Wife needs a new video card. Her last card was a GTX 660 that bit the dust today. She wants high/ultra quality, but her monitor resolution is 1680 x 1050. She plays first-person RPGs like Skyrim and is stoked as hell for Fallout 4 so I don't wanna get her something that's lovely but that weird rear end resolution is throwing me off. Am I looking at a GTX 760/R9 270X type situation here? We're in Canada and would rather not get too spendy, $300 is the ceiling here. The parts picking guide is old and I wasn't sure if it is still accurate, plus I can't find much selection of some of the cards listed at NCIX.

If it matters her CPU is an i5-3570K and she has an SSD and 16gb of RAM. Thanks helpful people.

Grab a r280x for $304 after rebate.

Hadlock
Nov 9, 2004

re: weird-rear end resolution

1680x1050 was a really popular standard for flat-panel displays from 2007-2011, before 1080p prices dropped. Higher res than 1366x768, but also 16:10 for a more computer-y aspect ratio rather than the squashed-looking 16:9 of 1920x1080. For a while there it was looking like the PC world was going towards 1920x1200 on the high end and 1680x1050 on the low end, but instead we got 1920x1080 and 1366x768 :(

The Wonder Weapon
Dec 16, 2006



The Iron Rose posted:

Grab a r280x for $304 after rebate.
Is there a particular reason you're recommending the 280X over a 290, 290X, or 970? I've spent the last 48 hours scouring video cards, and just a few posts up I settled on this, which is $310 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814487136). I'm not arguing; just curious.

Hadlock posted:

re: weird-rear end resolution

1680x1050 was a really popular standard for flat-panel displays from 2007-2011, before 1080p prices dropped. Higher res than 1366x768, but also 16:10 for a more computer-y aspect ratio rather than the squashed-looking 16:9 of 1920x1080. For a while there it was looking like the PC world was going towards 1920x1200 on the high end and 1680x1050 on the low end, but instead we got 1920x1080 and 1366x768 :(
Is 1920x1080 more common? My Dell U2412M is 1920x1200. I thought all the decent monitors landed on x1200.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.

The Wonder Weapon posted:

Is there a particular reason you're recommending the 280X over a 290, 290X, or 970? I've spent the last 48 hours scouring video cards, and just a few posts up I settled on this, which is $310 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814487136). I'm not arguing; just curious.


Canadian vs American

teagone
Jun 10, 2003

That was pretty intense, huh?

The Wonder Weapon posted:

Is 1920x1080 more common? My Dell U2412M is 1920x1200. I thought all the decent monitors landed on x1200.

1080p is ubiquitous.

BurritoJustice
Oct 9, 2012

Hadlock posted:

Specifically for Full ATX boards

What is the recommended Z170 chipset motherboard? It looks like they all float around $135-180, with the median being ~$165, and those include a single USB Type-C connector onboard (:dance:)

Also there are a couple of Z170 boards that cost $280-320 and seem to have the exact same feature sets. I can't figure out why they cost so much more.

I presume with all this VR goggle funny business that dual-SLI video cards are finally going to be relevant, so I'm going to want all the lanes that Z170 provides and two x16 PCIe slots, but I can't imagine why

$164: ASUS Z170-A LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 USB 3.0 ATX Intel Motherboard - http://www.newegg.com/Product/Product.aspx?Item=N82E16813132566&cm_re=Z170-_-13-132-566-_-Product

$319: ASUS Z170-DELUXE LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 USB 3.0 ATX Intel Motherboard - http://www.newegg.com/Product/Product.aspx?Item=N82E16813132568&cm_re=Z170-_-13-132-568-_-Product

As far as I can tell, it has an extra M.2 slot, 2 extra SATA2 ports (for a total of 8 SATA), one E-SATA and a second Gig-E port.

Asus charges bogobucks for their halo tier motherboards because, no matter what, people will buy them. See also, X99-Deluxe being the top selling X99 board despite being insanely expensive and also having a known CPU killing bug in the VRM.

Buy a motherboard for the features you actually need and will use, expensive boards aren't inherently "better".

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

BurritoJustice posted:

Asus charges bogobucks for their halo tier motherboards because, no matter what, people will buy them. See also, X99-Deluxe being the top selling X99 board despite being insanely expensive and also having a known CPU killing bug in the VRM.

Buy a motherboard for the features you actually need and will use, expensive boards aren't inherently "better".

And every mobo maker is trying so hard to pretend that their ~$50 offerings don't exist, precisely because those are more than adequate for at least 90% of users, even for gaming.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




The Wonder Weapon posted:

I think I'll just take the 970, OC it a bit, and be done with it.

One last quick question about matching RAM. I've got this installed currently: http://www.newegg.com/Product/Product.aspx?Item=N82E16820231550 What, if anything, do I need to keep in mind when adding to this? Any reason to pick 2x 4gb sticks over 1x 8gb stick other than future expansion options? Should I worry about timing, DDR3/4/5, etc? Or can I buy basically anything and slam it in and have it run no problem? For instance, would this work? http://www.newegg.com/Product/Produ...ID=3938566&SID=



Always do RAM in sets of matching pairs so you stay in dual channel mode.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Does anyone know what the wattage of a 6600k or 6700k is supposed to be under load, assuming I'm using a discrete graphics card? I know the listed TDP is 95 watts but that's supposed to be with 100% CPU and 100% integrated graphics load at the same time... with a discrete graphics card and Prime95 or whatever, how much wattage does Skylake-K actually pull?

edit: After some more Googling, I'm seeing 25 watts in idle and 65 watts under load for a 6600k. So, about the same as my overclocked G3258.

Zero VGS fucked around with this message at 05:26 on Aug 17, 2015

Scalding Coffee
Jun 26, 2006

You're already dead

The Wonder Weapon posted:

Is there a particular reason you're recommending the 280X over a 290, 290X, or 970? I've spent the last 48 hours scouring video cards, and just a few posts up I settled on this, which is $310 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814487136). I'm not arguing; just curious.

Is 1920x1080 more common? My Dell U2412M is 1920x1200. I thought all the decent monitors landed on x1200.
It is easier to market consoles on smaller resolutions.

Verviticus
Mar 13, 2006

I'm just a total piece of shit and I'm not sure why I keep posting on this site. Christ, I have spent years with idiots giving me bad advice about online dating and haven't noticed that the thread I'm in selects for people that can't talk to people worth a damn.
i have two sticks of 4GB DDR3 1333 and I want to go to 16GB. I plan to upgrade the computer, including motherboard in a year. i figure the RAM will not be obsolete by then. I'm trying to decide - another 2x4 or go with 1x8? the former is supposedly better performance but the latter is more flexible later if i find myself wanting to upgrade the total again. 2x4 also seems harder to find where I am

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Verviticus posted:

i have two sticks of 4GB DDR3 1333 and I want to go to 16GB. I plan to upgrade the computer, including motherboard in a year. i figure the RAM will not be obsolete by then. I'm trying to decide - another 2x4 or go with 1x8? the former is supposedly better performance but the latter is more flexible later if i find myself wanting to upgrade the total again. 2x4 also seems harder to find where I am

The latest Intel architecture uses different DDR3 DIMMs (low-voltage DDR3L) and DDR4. You probably won't be able to move your regular DDR3 to a Skylake (or Kaby Lake or whatever) board in a year. RAM is pretty cheap now, though, so I'd just get 2x4 and not stress about it.

Assepoester
Jul 18, 2004
Probation
Can't post for 10 years!
Melman v2
Something I missed earlier - it appears that the current Skylake chips are NOT quad-channel, just dual...

http://arstechnica.com/gadgets/2015/08/intel-skylake-core-i7-6700k-reviewed/

It's also important to note that, while Haswell-E and X99 supported quad-channel operation, Skylake and Sunrise Point will only do dual-channel. Presumably Skylake-E will again bring back a quad-channel integrated memory controller.

So all those quad-channel kits are basically useless, and we're better off getting, say, 2x8 GB DDR4 instead of those 4x4 GB DDR4 kits that are all over the place?





cr0y posted:

Sort of on the topic of building systems, can anyone recommend an application that will show CPU/GPU temperatures in my system tray or on the desktop? I would like to keep an eye on things until I decide if I am happy with my fan/airflow setup.
MSI Afterburner is fine for that. http://www.guru3d.com/files-details/msi-afterburner-beta-download.html

It will even display all the info you want in a DirectX overlay on top of games.




Verviticus posted:

i have two sticks of 4GB DDR3 1333 and I want to go to 16GB. I plan to upgrade the computer, including motherboard in a year. i figure the RAM will not be obsolete by then. I'm trying to decide - another 2x4 or go with 1x8? the former is supposedly better performance but the latter is more flexible later if i find myself wanting to upgrade the total again. 2x4 also seems harder to find where I am

Rexxed posted:

The latest Intel architecture uses different DDR3 DIMMs (low-voltage DDR3L) and DDR4. You probably won't be able to move your regular DDR3 to a Skylake (or Kaby Lake or whatever) board in a year. RAM is pretty cheap now, though, so I'd just get 2x4 and not stress about it.
Yeah you can't use regular DDR3 sticks with any current or upcoming Skylake boards, so unless by upgrade you mean to stick with Devil's Canyon or earlier... yeah.

In addition, can current DDR3 boards even accept an odd number of sticks without dropping out of Dual Channel mode?

Assepoester fucked around with this message at 09:22 on Aug 17, 2015

prussian advisor
Jan 15, 2007

The day you see a camera come into our courtroom, its going to roll over my dead body.
Quick question regarding the assembly of a set of computer parts I've already purchased.

I'm trying to install an SSD in one of the two "back mounts" of the Define R5 case. I've got it mounted just fine; the trouble is, I have no idea how to connect it or what to connect it to. All of the cables seem far too short to reach it, and even if I dismount it, none of them appear to fit the two "plug slots" on the end of it either. I'm sure I'm missing something obvious. Anyone know offhand what it is?

The Wonder Weapon
Dec 16, 2006



VulgarandStupid posted:

Always do RAM in sets of matching pairs so you stay in dual channel mode.

Ok. So I'd be getting 100% out of all my RAM if I added these (http://www.newegg.com/Product/Produ...ID=3938566&SID=) sticks to my existing (http://www.newegg.com/Product/Product.aspx?Item=N82E16820231550) sticks?

Konsek
Sep 4, 2006

Slippery Tilde
Okay, dumb question maybe, and I think I know the answer, but just in case...

I got the Hyper 212 Evo and the pointy bits on top touch the case. The bit of case it touches is a clear plastic window, not metal, if that makes a difference. The case bulges there by about 2mm. I measured everything before I bought it and I was sure it was going to fit. Is this going to cause problems?

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

prussian advisor posted:

Quick question regarding the assembly of a set of computer parts I've already purchased.

I'm trying to install an SSD in one of the two "back mounts" of the Define R5 case. I've got it mounted just fine; the trouble is, I have no idea how to connect it or what to connect it to. All of the cables seem far too short to reach it, and even if I dismount it, none of them appear to fit the two "plug slots" on the end of it either. I'm sure I'm missing something obvious. Anyone know offhand what it is?

The larger cable will be a SATA power cable coming off your power supply; the smaller one is a SATA data cable. Your motherboard should have come with a couple of these. One end connects to the SSD, the other to a SATA port on your motherboard.

prussian advisor
Jan 15, 2007

The day you see a camera come into our courtroom, its going to roll over my dead body.

AVeryLargeRadish posted:

The larger cable will be a SATA power cable coming off your power supply; the smaller one is a SATA data cable. Your motherboard should have come with a couple of these. One end connects to the SSD, the other to a SATA port on your motherboard.

At work now so I can't test this out, but this explains a lot. There are a couple of cables that came in a separate bag with the motherboard; I wasn't sure what they were for and the minimalist manual didn't really specify. I'll check this out when I get home, thank you.
