Confounding Factor posted:Here's my lovely build: Please do some cable management. I have the same case and that picture is making me sad.
# ? Jun 15, 2024 19:45 |
k-uno posted:Maybe this isn't the most appropriate thread for this question, but out of the boutique system vendors which is the best regarded? I'm building a workstation PC for physics research (e.g. tons of parallel 64-bit floating point calculations) and I don't want to go with my university's preferred vendors (Apple and Dell) because the cost is just outrageous. I could probably convince IT to purchase a complete system from somewhere like AVAdirect, but buying all the individual parts from NewEgg might get turned down, and not having a full system 3 year warranty would be a pain in the rear end. Gaming performance is irrelevant, but a good workstation graphics card with CUDA/OpenCL support is important because more and more applications are starting to natively support it. Puget Systems is pretty well regarded, you should check them out.
The Wonder Weapon posted:Can you confirm your links are correct? The EVGA card you provide (http://www.newegg.com/Product/Product.aspx?Item=N82E16814487136) says in giant letters that a newer version of the card is available, and directs to the one I linked instead. Did they gently caress up the cooler on newer models? The one I linked is the newer model, look at the model numbers, 04G-P4-3973-KR is the one I linked with the ACX 2.0+ cooler, the one it incorrectly redirects to is 04G-P4-2974-KR, 2XXX models have the old ACX 2.0 cooler, 3XXX models have the ACX 2.0+. I dunno what the hell Newegg is doing there but they are redirecting people to an older model, not a newer one.
Sort of on the topic of building systems, can anyone recommend an application that will display CPU/GPU temperatures in my system tray or on the desktop? I would like to keep an eye on things until I decide if I am happy with my fan/air flow setup.
AVeryLargeRadish posted:The one I linked is the newer model, look at the model numbers, 04G-P4-3973-KR is the one I linked with the ACX 2.0+ cooler, the one it incorrectly redirects to is 04G-P4-2974-KR, 2XXX models have the old ACX 2.0 cooler, 3XXX models have the ACX 2.0+. I dunno what the hell Newegg is doing there but they are redirecting people to an older model, not a newer one. Ah ok, gotcha. Do you think it's really worth getting a 970 over a 290? This XFX 290 (http://www.newegg.com/Product/Produ...ID=3938566&SID=) is a full $40 cheaper than the EVGA 970. How much more performance am I going to get out of the 970 here? Will the card matter at 1920x1080? Will the rest of my system bottleneck elsewhere, rendering the advantages the 970 may provide null? The difference between $275 and $315 isn't that large, but I also don't want to pay $40 for extra juice I'll never really use/see.
The Wonder Weapon posted:Ah ok, gotcha.
Niwrad posted:I'm buying RAM and was set on 8GB. Noticed it's $35 more to go to 16GB. Is there any value at all to going to 16GB? It seems like the consensus is that it'd be a waste, but I also remember people talking about how you wouldn't need more than 4GB a few years ago. Mainly just using the computer for Chrome, Office, and some strategy gaming (Civ 5, Football Manager). A few years ago I opted to grab 16gb and have never once regretted it. Especially since the prices are still higher now than they were for the same memory at the time. Memory tends to hold its value quite well so you can easily offload it if need be. It all depends on the amount of multitasking you do (I do quite a lot), but there has never been an instance where I've lamented having too much of anything in regards to specs.
Confounding Factor posted:Hey guys, I have encountered an issue with my poverty build. CS:GO is a lovely game that has a bunch of 'optimizations' that try to make it work on simultaneously the worst and best computers, breaking it for everyone in the middle. Try putting the following in your launch options: -novid -high -threads 4 And definitely add the following to your config: cl_forcepreload "1" That should fix things.
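For anyone copy-pasting, here are the same settings collected in one place (a sketch of how they'd be entered; the launch options go in Steam under the game's Properties, and the cvar goes in autoexec.cfg):

```
// Launch options (Steam > CS:GO > Properties > Set Launch Options):
//   -novid -high -threads 4
// Then in csgo/cfg/autoexec.cfg:
cl_forcepreload "1"
```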
The Wonder Weapon posted:Ah ok, gotcha. As Josh Lyman said, once you OC the 970 it will be solidly ahead of the 290. The 970 also has more stable performance, with higher minimum frame rates and somewhat lower maximum frame rates, which leads to games feeling smoother overall.
AVeryLargeRadish posted:As Josh Lyman said, once you OC the 970 it will be solidly ahead of the 290, also the 970 has more stable performance, higher minimum frame rates and somewhat lower maximum frame rates which leads to the games feeling smoother overall. What is involved in OCing the 970? If it's any more than "change a number in the config" then I'll probably avoid it, since I have neither the time nor interest required for enthusiast levels of hardware tweaking. e: If this video is an accurate representation, that isn't bad at all https://www.youtube.com/watch?v=VIzICd3mnc8 The Wonder Weapon fucked around with this message at 20:08 on Aug 16, 2015 |
The Wonder Weapon posted:What is involved in OCing the 970? If it's any more than "change a number in the config" then I'll probably avoid it, since I have neither the time nor interest required for enthusiast levels of hardware tweaking. Yeah, it's that easy.
Not sure if this is a bit too technical to ask here, but let's say I have an EVGA 1300w Supernova G2 powering 300w TDP worth of computer: The fan on this PSU does not have a silent mode at low temps like the 850w and below models have, it runs at a pretty loud full-blast. If I wanted to disable the fan, either by unsoldering it, soldering a potentiometer in-line with the fan wire so I can adjust speeds, or just using something like a zip-tie to physically hold the fan blades in place, which of those would be least risky in terms of overheating the PSU or permanently screwing up the fan or electronics? A gold PSU at 20% load is supposed to be 87% efficient, so 300 watts load suggests about 50 watts of waste heat, if that helps figure this out any.
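The efficiency arithmetic in the question above can be sanity-checked in a few lines (a sketch; the 87% figure is the poster's own, from the 80 Plus Gold curve at roughly 20% load):

```python
# Waste heat inside a PSU: the unit draws load/efficiency from the wall,
# and the difference between wall draw and DC load is dissipated as heat.
def psu_waste_heat(load_w: float, efficiency: float) -> float:
    """Heat (watts) dissipated by the PSU at a given DC load."""
    wall_draw_w = load_w / efficiency
    return wall_draw_w - load_w

# 300 W of DC load at 87% efficiency:
print(round(psu_waste_heat(300, 0.87)))  # 45 -> close to the ~50 W estimate
```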
Zip ties sound like a bad idea, personally. Wouldn't the motor turning the fan keep going?
Zero VGS posted:Not sure if this is a bit too technical to ask here, but let's say I have an EVGA 1300w Supernova G2 powering 300w TDP worth of computer: It's generally not a great idea to gently caress around with the cooling fan in a PSU. That said, I've totally opened a PSU, removed the stock fan and replaced it with a relatively quiet 120mm fan and then plugged the fan in to a spare motherboard header for speed control. I then proceeded to use that PSU for four years jury-rigged together (it helped that it was a really high quality Strider). Before anyone asks, I didn't have a single component failure in that time. So yes, it can be done. Probably not a great idea, but it can be done.
Krailor posted:Puget Systems is pretty well regarded, you should check them out. I looked at them as well, but they're priced as high as Dell so it doesn't seem worth it. By the way, how much should a PSU's rated wattage exceed the expected system wattage at full load? Figure 140W for the i7 (I'm not planning to overclock, this machine needs to be reliable), 200W for the graphics card, and maybe 50W max for the RAM, motherboard and drives? Would a 650W power supply be adequate to ensure complete stability?
You don't need to guess, Intel and Nvidia publish the maximum power draw figures for parts. For example, an i7 4790k is an 88W part, not 140. Add your CPU and GPU power draw, add 50w for the rest of your system and multiply by 1.25 for your minimum PSU rating. 650w is likely overkill.
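That rule of thumb, written out as a sketch (the wattages below are example figures from this exchange, not exact specs for any particular build):

```python
# Minimum PSU rating: CPU max draw + GPU max draw + 50 W for the rest
# of the system, all multiplied by 1.25 for headroom.
def min_psu_watts(cpu_w: float, gpu_w: float, rest_w: float = 50.0) -> float:
    return (cpu_w + gpu_w + rest_w) * 1.25

# e.g. an 88 W i7-4790K plus a GPU with a ~150 W maximum board power:
print(min_psu_watts(88, 150))  # 360.0 -> even a 450 W unit has margin
```

Plugging in the original poster's 140 W CPU and 200 W GPU gives 487.5 W, which is why 650 W comes out as overkill.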
grack posted:You don't need to guess, Intel and Nvidia publish the maximum power draw figures for parts. For example, an i7 4790k is an 88W part, not 140. It's going to be an X99 board with an 8 core chip, so 140W TDP is actually the correct figure. But 1.25 as a multiplier sounds like a fairly comfortable window.
This is for DDR3L, correct? There's no way to reuse regular DDR3 RAM, is there?
Specifically for full ATX boards: what is the recommended Z170 chipset motherboard? It looks like they all float around $135-180 with the median being ~$165, and those include a single USB Type-C connector onboard. Also there are a couple of Z170 boards that cost $280-320 and seem to have the exact same feature sets; I can't figure out why they cost so much more. I presume with all this VR goggle funny business that dual SLI video cards are going to finally be relevant, so I'm going to want all the lanes that Z170 provides and two x16 PCIe slots, but I can't see what justifies the gap: $164: ASUS Z170-A LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 USB 3.0 ATX Intel Motherboard - http://www.newegg.com/Product/Product.aspx?Item=N82E16813132566&cm_re=Z170-_-13-132-566-_-Product $319: ASUS Z170-DELUXE LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 USB 3.0 ATX Intel Motherboard - http://www.newegg.com/Product/Product.aspx?Item=N82E16813132568&cm_re=Z170-_-13-132-568-_-Product As far as I can tell, the Deluxe has an extra M.2 slot, 2 extra SATA ports (for a total of 8 SATA), one E-SATA and a second Gig-E port.
Ernie. posted:CS:GO is a lovely game that has a bunch of 'optimizations' that try to make it work on simultaneously the worst and best computers breaking it for everyone in the middle. Yeah I've done that already and followed the advice here. Nothing is working, and now I'm having the same issue when I play CS:Source. It just doesn't make any sense. I'm frustrated cause I went the cheap route and the two games I play the most aren't working out.
I think I'll just take the 970, OC it a bit, and be done with it. One last quick question about matching RAM. I've got this installed currently: http://www.newegg.com/Product/Product.aspx?Item=N82E16820231550 What, if anything, do I need to keep in mind when adding to this? Any reason to pick 2x 4gb sticks over 1x 8gb stick other than future expansion options? Should I worry about timing, DDR3/4/5, etc? Or can I buy basically anything and slam it in and have it run no problem? For instance, would this work? http://www.newegg.com/Product/Produ...ID=3938566&SID= The Wonder Weapon fucked around with this message at 00:27 on Aug 17, 2015 |
Wife needs a new video card. Her last card was a GTX 660 that bit the dust today. She wants high/ultra quality, but her monitor resolution is 1680 x 1050. She plays first-person RPGs like Skyrim and is stoked as hell for Fallout 4 so I don't wanna get her something that's lovely but that weird rear end resolution is throwing me off. Am I looking at a GTX 760/R9 270X type situation here? We're in Canada and would rather not get too spendy, $300 is the ceiling here. The parts picking guide is old and I wasn't sure if it is still accurate, plus I can't find much selection of some of the cards listed at NCIX. If it matters her CPU is an i5-3570K and she has an SSD and 16gb of RAM. Thanks helpful people.
boneration posted:Wife needs a new video card. Her last card was a GTX 660 that bit the dust today. She wants high/ultra quality, but her monitor resolution is 1680 x 1050. She plays first-person RPGs like Skyrim and is stoked as hell for Fallout 4 so I don't wanna get her something that's lovely but that weird rear end resolution is throwing me off. Am I looking at a GTX 760/R9 270X type situation here? We're in Canada and would rather not get too spendy, $300 is the ceiling here. The parts picking guide is old and I wasn't sure if it is still accurate, plus I can't find much selection of some of the cards listed at NCIX. Grab a r280x for $304 after rebate.
re: weird-rear end resolution 1680x1050 was a really popular standard for flat-panel displays from 2007-2011, before 1080p prices dropped. Higher res than 1366x768, but also 16:10 for a more computer-y aspect ratio rather than the squashed-looking 16:9 of 1920x1080. For a while there it was looking like the PC world was going towards 1920x1200 on the high end and 1680x1050 on the low end, but instead we got 1920x1080 and 1366x768.
The Iron Rose posted:Grab a r280x for $304 after rebate. Hadlock posted:re: weird-rear end resolution
The Wonder Weapon posted:Is there a particular reason you're recommending the 280x over a 290, 290x, or 970? I've spent the last 48 hours scouring video cards, and just some posts up I settled on this, which is $310 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814487136) I'm not arguing; just curious. Canadian vs American
The Wonder Weapon posted:Is 1920x1080 more common? My Dell U2412M is 1920x1200. I thought all the decent monitors landed on x1200. 1080p is ubiquitous.
Hadlock posted:Specifically for Full ATX boards Asus charges bogobucks for their halo tier motherboards because, no matter what, people will buy them. See also, X99-Deluxe being the top selling X99 board despite being insanely expensive and also having a known CPU killing bug in the VRM. Buy a motherboard for the features you actually need and will use, expensive boards aren't inherently "better".
BurritoJustice posted:Asus charges bogobucks for their halo tier motherboards because, no matter what, people will buy them. See also, X99-Deluxe being the top selling X99 board despite being insanely expensive and also having a known CPU killing bug in the VRM. And every mobo maker is trying so hard to pretend that their ~$50 offerings don't exist, precisely because those are more than adequate for at least 90% of users, even for gaming.
The Wonder Weapon posted:I think I'll just take the 970, OC it a bit, and be done with it. Always do RAM in sets of matching pairs so you stay in dual channel mode.
Does anyone know what the wattage of a 6600k or 6700k is supposed to be under load, assuming I'm using a discrete graphics card? I know the listed TDP is 95 watts but that's supposed to be with 100% CPU and 100% integrated graphics load at the same time... with a discrete graphics card and Prime95 or whatever, how much wattage does Skylake-K actually pull? edit: After some more Googling, I'm seeing 25 watts in idle and 65 watts under load for a 6600k. So, about the same as my overclocked G3258. Zero VGS fucked around with this message at 05:26 on Aug 17, 2015 |
The Wonder Weapon posted:Is there a particular reason you're recommending the 280x over a 290, 290x, or 970? I've spent the last 48 hours scouring video cards, and just some posts up I settled on this, which is $310 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814487136) I'm not arguing; just curious.
i have two sticks of 4GB DDR3 1333 and I want to go to 16GB. I plan to upgrade the computer, including motherboard in a year. i figure the RAM will not be obsolete by then. I'm trying to decide - another 2x4 or go with 1x8? the former is supposedly better performance but the latter is more flexible later if i find myself wanting to upgrade the total again. 2x4 also seems harder to find where I am
Verviticus posted:i have two sticks of 4GB DDR3 1333 and I want to go to 16GB. I plan to upgrade the computer, including motherboard in a year. i figure the RAM will not be obsolete by then. I'm trying to decide - another 2x4 or go with 1x8? the former is supposedly better performance but the latter is more flexible later if i find myself wanting to upgrade the total again. 2x4 also seems harder to find where I am The latest Intel architecture is using different DDR3 dimms and DDR4. You probably won't be able to move your regular DDR3 to a skylake (or kaby lake or whatever) in a year. RAM is pretty cheap now, though, so I'd just get 2x4 and not stress about it.
Something I missed earlier - it appears that the current Skylake chips are NOT quad-channel, just dual... http://arstechnica.com/gadgets/2015/08/intel-skylake-core-i7-6700k-reviewed/ It's also important to note that, while Haswell-E and X99 supported quad-channel operation, Skylake and Sunrise Point will only do dual-channel. Presumably Skylake-E will again bring back a quad-channel integrated memory controller. So all those quad-channel kits are basically useless and we're better off getting, say, 2x8 GB DDR4 instead of those 4x4 GB DDR4 kits that are all over the place? cr0y posted:Sort of on the topic of building systems, can anyone recommend an application that will display CPU/GPU temperatures in my system tray or on the desktop? I would like to keep an eye on things until I decide if I am happy with my fan/air flow setup. It will even display all the info you want in a DirectX overlay on top of games. Verviticus posted:i have two sticks of 4GB DDR3 1333 and I want to go to 16GB. I plan to upgrade the computer, including motherboard in a year. i figure the RAM will not be obsolete by then. I'm trying to decide - another 2x4 or go with 1x8? the former is supposedly better performance but the latter is more flexible later if i find myself wanting to upgrade the total again. 2x4 also seems harder to find where I am Rexxed posted:The latest Intel architecture is using different DDR3 dimms and DDR4. You probably won't be able to move your regular DDR3 to a skylake (or kaby lake or whatever) in a year. RAM is pretty cheap now, though, so I'd just get 2x4 and not stress about it. In addition, can current DDR3 boards even accept an odd number of sticks without dropping out of Dual Channel mode? Assepoester fucked around with this message at 09:22 on Aug 17, 2015 |
Quick question regarding the assembly of a set of computer parts I've already purchased. I'm trying to install an SSD in one of the two "back mounts" of the Define R5 case. I've got it mounted just fine; the trouble is, I have no idea how to connect it or what to connect it to. All of the cables seem far too short to reach it, and even if I dismount it, none of them appear to fit the two "plug slots" on the end of it either. I'm sure I'm missing something obvious. Anyone know offhand what it is?
VulgarandStupid posted:Always do RAM in sets of matching pairs so you stay in dual channel mode. Ok. So I'd be getting 100% out of all my RAM if I added these (http://www.newegg.com/Product/Produ...ID=3938566&SID=) sticks to my existing (http://www.newegg.com/Product/Product.aspx?Item=N82E16820231550) sticks?
Okay, dumb question maybe, and I think I know the answer, but just in case... I got the Hyper 212 Evo and the pointy bits on top touch the case. The bit of case it touches is a clear plastic window, not metal, if that makes a difference. The case bulges there by about 2mm. I measured everything before I bought it and I was sure it was going to fit. Is this going to cause problems?
prussian advisor posted:Quick question regarding the assembly of a set of computer parts I've already purchased. The larger cable will be a SATA power cable coming off your power supply, the smaller one is a SATA data cable, your motherboard should have come with a couple of these, one end connects to the SSD, the other to a SATA port on your motherboard.
|
AVeryLargeRadish posted:The larger cable will be a SATA power cable coming off your power supply, the smaller one is a SATA data cable, your motherboard should have come with a couple of these, one end connects to the SSD, the other to a SATA port on your motherboard. At work now so can't test this out but this explains a lot, there's a couple cables that came in a separate bag with the motherboard, I wasn't sure what they were for and the minimalistic manual didn't really specify. I'll check this out when I get home, thank you.