|
Factory Factory posted:Z also allows PCIe lane bifurcation for SLI and CrossFire graphics configurations (or really, for two or three high-performance PCIe devices regardless of use). H boards could be hacked the same way with a PCIe bridge, but that's more expensive than just using a Z in the first place. That reminds me... will a single-GPU setup suffer from performance degradation if I put it in a PCIe 3.0 x8 slot instead of an x16 slot? I'm driving a 1080p monitor with it. Also, are modern SSDs likely to saturate the connection in a PCIe 2.0 x2 slot, or are they not limited by the connection?
|
# ? Jul 26, 2014 20:30 |
|
atomicthumbs posted:That reminds me... will single-GPU setup suffer from performance degradation if I put it in a PCIe 3.0 x8 slot instead of an x16 slot? I'm driving a 1080p monitor with it. Someone can confirm this but I believe PCIe 3.0 x8 bandwidth is equal to the bandwidth of PCIe 2.0 x16 (though ultimately it depends on how the lane was electrically wired). And I don't think PCIe 2.0 x16 was very limiting to single card situations. There was some concern that R9 290X cards in Crossfire might be limited by PCIe 3.0 when the slots are operating as x8 but I'm not sure if that bore out (and obviously in this instance this doesn't matter). Canned Sunshine fucked around with this message at 20:45 on Jul 26, 2014 |
# ? Jul 26, 2014 20:41 |
|
Sidesaddle Cavalry posted:This, though what was their reasoning for making all those [insert badass name here]-Z boards for Z68 again? Was it just to have new stuff for the initial launch of Ivy Bridge? I remember Z77 came out really quickly afterwards. Z68 instead of P67, back when "can OC but can't use iGPU" was somehow seen as a meaningful segment differentiator. (Yes, I'm aware there are other differences, but that was the primary one, and pretty much everyone who didn't already buy a P67 board just went with a Z68 for OCing).
|
# ? Jul 26, 2014 21:05 |
|
atomicthumbs posted:That reminds me... will single-GPU setup suffer from performance degradation if I put it in a PCIe 3.0 x8 slot instead of an x16 slot? I'm driving a 1080p monitor with it. quote:Also, are modern SSDs likely to saturate the connection in a PCIe 2.0 2x slot, or are they not limited by connection?
|
# ? Jul 26, 2014 21:27 |
|
SourKraut posted:While I don't think performance is going to suffer, why not just use the x16 slot? I'm not sure what you mean by 'how it was wired', but yes, PCIe 3.0 x8 is about the same as PCIe 2.0 x16. PCIe 2.0 uses a 5 GT/s line rate with 8b/10b line coding, giving 5*(8/10) = 4 Gb/s per lane. 3.0 is 8 GT/s with 128b/130b encoding, or about 7.88 Gb/s per lane.
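The arithmetic above can be sketched in a few lines (the function name is just for illustration; this counts line-coding overhead only, and real-world throughput is somewhat lower once packet and protocol overhead are included):

```python
# Rough sanity check of the per-lane math above.

def effective_gbps(gt_per_s, payload_bits, total_bits):
    """Effective data rate per lane after line coding, in Gb/s."""
    return gt_per_s * payload_bits / total_bits

gen2_lane = effective_gbps(5.0, 8, 10)     # PCIe 2.0: 5 GT/s, 8b/10b
gen3_lane = effective_gbps(8.0, 128, 130)  # PCIe 3.0: 8 GT/s, 128b/130b

print(f"PCIe 2.0 per lane: {gen2_lane:.2f} Gb/s")   # 4.00
print(f"PCIe 3.0 per lane: {gen3_lane:.2f} Gb/s")   # 7.88
print(f"PCIe 2.0 x16: {gen2_lane * 16:.1f} Gb/s")   # 64.0
print(f"PCIe 3.0 x8:  {gen3_lane * 8:.1f} Gb/s")    # 63.0
```

By this rough math PCIe 3.0 x8 (~63 Gb/s) lands almost exactly on PCIe 2.0 x16 (64 Gb/s), which is why the two are usually treated as equivalent.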
|
# ? Jul 27, 2014 02:03 |
|
GokieKS posted:Z68 instead of P67, back when "can OC but can't use iGPU" was somehow seen as a meaningful segment differentiator. That solves my mystery, and makes me wonder why Intel never bothered to develop at least a lovely not-on-die GPU to integrate onto X-series chipsets for utility purposes.
|
# ? Jul 27, 2014 02:48 |
|
BobHoward posted:I'm not sure what you mean by 'how it was wired'
|
# ? Jul 27, 2014 05:50 |
|
atomicthumbs posted:That reminds me... will single-GPU setup suffer from performance degradation if I put it in a PCIe 3.0 x8 slot instead of an x16 slot? I'm driving a 1080p monitor with it. Well, I've got a GTX 780 Ti on a P67 motherboard, the fourth card to pull main rendering duty (GTX 580 --> 680 --> 780 --> 780 Ti), and it has scaled pretty much 1:1 with expected results, comparing benchmarks etc. with other people using overpriced graphics cards. My graphics performance in general shows no difference if I pull my PhysX card (ahahaha), running on PCI-e 2.0 x8 (x16 without the physics coprocessor). I'm fairly confident you'll be fine at PCI-e 3.0 x8 if I'm fine at PCI-e 2.0 x8.
|
# ? Jul 27, 2014 07:28 |
|
SourKraut posted:While I don't think performance is going to suffer, why not just use the x16 slot? I think if I put an SSD in the second long slot on a VII Gene it'd knock the first one down from x16 to x8. I'm trying to figure out how much that'd hurt performance-wise. edit: and the reason I'd be doing that is complicated and involves M.2, mini-PCIe, and a video capture card atomicthumbs fucked around with this message at 10:39 on Jul 27, 2014 |
# ? Jul 27, 2014 10:36 |
|
Basically not at all. For some reason I can't Google up literally any of the review articles on this, but even the beefiest single-GPU video cards lose barely 1% on average dropping from PCIe 3.0 x16 all the way down to x4 (or PCIe 2.0 x8). At 1080p you likely won't even see a full 1% drop, as the difference shows up more at higher resolutions like 2560x1440.
|
# ? Jul 27, 2014 12:00 |
|
For most game-like loads, the performance hit will be nearly nothing so long as the GPU has enough local memory to hold all the textures and static vertex data. GPU command traffic doesn't need much bandwidth. If you're doing something GPGPU-ish, that's the kind of workload that usually depends more on PCIe bandwidth.
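As a back-of-the-envelope sketch of why the bus rarely matters for games (the function name and the 4 MB per-frame figure are made-up illustration values, not measurements): once textures are resident in VRAM, only commands, uniforms, and dynamic buffers cross the bus each frame.

```python
# Hypothetical estimate: time to move one frame's worth of data over PCIe.
# Assumes textures already live in VRAM, so per-frame traffic stays small.

def frame_transfer_ms(megabytes_per_frame, link_gbps):
    """Time to push one frame's data across the link, in milliseconds."""
    bits = megabytes_per_frame * 8 * 1e6   # MB -> bits (decimal MB for simplicity)
    return bits / (link_gbps * 1e9) * 1000

per_frame_mb = 4.0        # assumed per-frame upload: a few MB of dynamic data
gen3_x16 = 7.88 * 16      # ~126 Gb/s effective
gen3_x8 = 7.88 * 8        # ~63 Gb/s effective

print(f"x16: {frame_transfer_ms(per_frame_mb, gen3_x16):.3f} ms")
print(f"x8:  {frame_transfer_ms(per_frame_mb, gen3_x8):.3f} ms")
```

With these assumed numbers both transfers come out around half a millisecond or less, a rounding error next to the 16.7 ms frame budget at 60 fps, which lines up with the ~1% scaling losses seen in reviews.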
|
# ? Jul 27, 2014 13:08 |
|
Factory Factory posted:Basically not at all. For some reason I can't Google up literally any of the review articles on this, but even the most beefy of single-GPU video cards lose barely 1% on average dropping from PCIe 3.0 x16 all the way to x4 (or PCIe 2.0 x8). http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/
|
# ? Jul 27, 2014 18:23 |
|
Agreed posted:Well, I've got a GTX 780Ti on a P67 motherboard, fourth card to pull main rendering duty (GTX 580-->680-->780-->780Ti), and it has scaled pretty much 1:1 with expected results - that is, comparing benchmarks etc. with other people using overpriced graphics cards - and my graphics performance in general shows no difference if I pull my PhysX card (ahahaha), on PCI-e 2.0 8x (16x without physics coprocessor ). Is it worth having a PhysX card? edit: as in, is it even worth the wattage? I doubt I'm SLI'ing anytime soon, so I could throw down on some $50 Craigslist GPU, but I know very little about the practical benefits of a PhysX card. The annoying part is the power consumption of old cards. edit2: well, I found a GTX 750 for $50, so I guess I'll be doing it anyways 1gnoirents fucked around with this message at 15:18 on Jul 28, 2014 |
# ? Jul 28, 2014 15:10 |
|
1gnoirents posted:Is it worth having a physx card?
|
# ? Jul 28, 2014 15:58 |
|
Is it even worth... 50 watts? If I'm better off buying a steak and beer instead I'm content with that but if there is any reason at all
|
# ? Jul 28, 2014 16:39 |
|
1gnoirents posted:Is it even worth... 50 watts? If I'm better off buying a steak and beer instead I'm content with that but if there is any reason at all Buy a steak and beer. Any card that draws only 50 watts isn't going to improve your performance at all, and the beefy cards just aren't worth it for how few games really use PhysX. You're better off putting that money into getting a more powerful primary card.
|
# ? Jul 28, 2014 17:17 |
|
KillHour posted:Buy a steak and beer. Any card that draws only 50 watts isn't going to improve your performance at all, and the beefy cards just aren't worth it for how few games really use PhysX. You're better off putting that money into getting a more powerful primary card. I have a 780 Ti (for 1440p) and the card I saw is a $50 GTX 750. But it's looking like steak and beer more and more (just realized this wasn't the GPU thread, sorry)
|
# ? Jul 28, 2014 17:23 |
|
KillHour posted:Buy a steak and beer. Any card that draws only 50 watts isn't going to improve your performance at all, and the beefy cards just aren't worth it for how few games really use PhysX. You're better off putting that money into getting a more powerful primary card. For what it's worth, the 750 is able to handle PhysX pretty well on its own.
|
# ? Jul 28, 2014 17:23 |
|
1gnoirents posted:Is it even worth... 50 watts? If I'm better off buying a steak and beer instead I'm content with that but if there is any reason at all I have never once, with my GTX 680 or now with my 780 Ti, encountered a situation where I wished I had a separate PhysX card. Barely anything uses it, and when it does, you just turn it on and your fps is perfectly fine.
|
# ? Jul 28, 2014 17:26 |
|
I've seen the light, thanks lol.
|
# ? Jul 28, 2014 17:28 |
|
The Lord Bude posted:I have never once, with my gtx680, or now with my 780ti encountered a situation where I wished I had a separate physX card. barely anything uses it, and when it does, you just turn it on and your fps is perfectly fine. Honestly, NVIDIA is screwing themselves over by refusing to work with AMD cards, because I bet they'd sell a few 750s to people who already have a high-end Radeon. I don't actually get the logic. It's a nice-to-have, but if you already got a sweet deal on a 290 or something, you're not going to spend a bunch getting a 780 instead for a locked-in feature, though you might think about the aforementioned used $50 750.
|
# ? Jul 28, 2014 17:29 |
|
1gnoirents posted:Is it even worth... 50 watts? If I'm better off buying a steak and beer instead I'm content with that but if there is any reason at all In games that are optimized for PhysX it looks like you could get around a 20% performance increase, so it might make sense if:
1. You play a lot (I mean a LOT) of games that are optimized for PhysX, and
2. You love playing with advanced PhysX, and
3. You aren't happy with your current framerates.
Only real article I could find on it: http://alienbabeltech.com/main/using-maxwells-gtx-750-ti-dedicated-physx-card/
|
# ? Jul 28, 2014 17:37 |
|
It impacts minimum framerates more than maximum framerates because there's some difficulty in the load balancing/role switching. Said role-switching and the impact it had on FPS was worse with Fermi, going from memory, and also SMXes seem to just be plain better at it.

I previously had a headless GTX 580 for CUDA stuff and that's what led to me caring about physics co-processors at all. I personally think that it's an antiquated idea, and by its very nature it's graphics not gameplay, and I look forward to non-proprietary GPGPU physics engines becoming more prevalent as the console generation matures and developers start taking advantage of the rather substantial compute power (... for a console!), and that'll ... trickle up? I hope?

I personally think that the Bullet physics engine is a good GPGPU engine, open source, but regardless: no, it is not worth it to have a PhysX card unless you play a shitload of PhysX games, which you deductively cannot because you don't need very many hands to count the good games that use PhysX. Also, that god damned pepper shaker in Alice Returns is still going to tank your FPS (probably).
|
# ? Jul 28, 2014 18:50 |
|
Probably better to sell your existing card and buy a higher end card with the money you would spend on a physx card.
|
# ? Jul 28, 2014 19:23 |
|
^^Hmm, I'd typically agree. However, in my situation I can't really do that. After looking at the games that use it and actually finding more than one I play(ed), I guess I don't have a lot to lose since I can just sell it. Perhaps the UT4 engine will use PhysX like UT3 did. Thanks everyone, and sorry again for the derail
|
# ? Jul 28, 2014 20:15 |
|
1gnoirents posted:^^Hmm, I'd typically agree. However in my situation I can't really do that. After looking at the games that use it and actually finding more than 1 I play(ed) I guess I don't have a lot to lose since I can just sell it. Perhaps the UT4 engine will use physx like UT3. Thanks everyone, and sorry again for the derail

Pros: Borderlands games, Batman games (is this still a pro? ... that was a pretty wicked lookin' batmobile in the trailer...), uhhh, Mafia II, Alice Returns, and the two Metro games

Cons: You bought a kind of expensive luxury item for a really silly reason. Dummy.

This hits close to home
|
# ? Jul 29, 2014 02:32 |
|
Delidding i7-5960X
|
# ? Jul 29, 2014 17:32 |
|
Welmu posted:Delidding i7-5960X Thatll buff right out.
|
# ? Jul 29, 2014 17:37 |
|
Don Lapre posted:Thatll buff right out. It should line back up when you reseat the CPU in the socket.
|
# ? Jul 29, 2014 17:40 |
|
|
# ? Jul 29, 2014 17:51 |
|
Welmu posted:Delidding i7-5960X I GIS'd this image and, if nothing else, it was just a sample and not something someone bought. Also, it seems Intel learned from the issues they had with Haswell and the lids before, so... http://www.kitguru.net/components/cpu/anton-shilov/intel-core-i7-5960x-haswell-e-de-lidded-interesting-discoveries-found/
|
# ? Jul 29, 2014 23:20 |
|
Couldn't Intel have given them a heads-up about the soldering? Although it may just be done for website hits, it is a striking image after these years of non-soldered heatspreaders.
|
# ? Jul 29, 2014 23:28 |
|
Has anybody tried to solder a CPU themselves?
|
# ? Jul 30, 2014 04:00 |
|
1gnoirents posted:Has anybody tried to solder a cpu themselves ? No. Lots of people have floated the idea, but after you pass out of the Dunning-Kruger zone doing research you find out that low-temperature solder that connects aluminum with glass and ceramics is very difficult to find and execute. Why glass and ceramics, you ask? That's what the outer layers of a CPU are made of, silicon glass and ceramic protective layers over the gooey silicon-and-metal-bits core.
|
# ? Jul 30, 2014 04:50 |
|
If one of those CPUs has disabled cores, is it possible to enable them?
|
# ? Jul 30, 2014 11:02 |
|
Generally, no. Intel especially will lock off unused parts by cutting links with a laser. When and if this isn't done, as was the case with many of AMD's Athlon II and Phenom II era CPUs and a couple of their GPUs (like the Radeon 6950), then theoretically you can activate the locked-off parts using custom firmware or a BIOS/UEFI that exposes such controls. However, the parts are usually locked off for a good reason and such unlocks often fail.
|
# ? Jul 30, 2014 11:06 |
|
Factory Factory posted:Generally, no. Intel especially will lock off unused parts by cutting links with a laser. or a pencil
|
# ? Jul 30, 2014 11:12 |
|
A 12-core CPU would be pretty awesome though (I like programming with concurrency)
|
# ? Jul 30, 2014 12:43 |
|
Lord Windy posted:A 12 core CPU would be pretty awesome though Most people who buy an i7 would rather have 8 cores and the higher clock speed, and those who need many more cores can get there with extremely expensive Xeons. (Although those are the ridiculous 8-way socket models; it was more fun to pick those out.) HalloKitty fucked around with this message at 13:19 on Jul 30, 2014 |
# ? Jul 30, 2014 13:09 |
|
HalloKitty posted:So buy one. That's what Xeon workstations are for. Of course, it's Ivy not Haswell, but I imagine Haswell Xeons will follow soon. Oh, I spot-price AWS CPU servers whenever I want to get my kicks in with heaps of CPUs. I'm honestly pretty happy with my 4-core, 8-thread CPU. They will probably release a 12-core version of that with reduced clock speed, which I won't buy because I simply don't need it for everyday computing.
|
# ? Jul 30, 2014 13:24 |