|
HalloKitty posted:Yay for awful thermal performance without delidding on Skylake!
|
# ? Aug 9, 2015 21:22 |
|
|
HalloKitty posted:Yay for awful thermal performance without delidding on Skylake! Ehhh, it would also mean setting up another production line to serve a very small market, a market that they have already captured because AMD can't compete. Production lines are not cheap.
|
|
# ? Aug 9, 2015 21:26 |
|
I had seen some people say the smaller dies of the latest chips are also a reason?
|
# ? Aug 9, 2015 22:05 |
|
Don Lapre posted:I had seen some people say the smaller dies of the latest chips is also a reason?
|
# ? Aug 9, 2015 22:20 |
|
Managed to get the first Maximus VIII Gene on Amazon but now stuck without the 6700k. Almost gave in and got one from the UK for an 80% markup; there's one on Amazon for the same price from Israel, but he only indicates on the shipping page that it will ship next month. Basically a scam at that point.
|
# ? Aug 9, 2015 22:51 |
|
HalloKitty posted:If it's such a cost issue as they always say, why was it so viable in the Sandy Bridge days, when Sandy Bridge cost less comparatively than the newer CPUs? Probably because when SB dropped, people still at least remembered when AMD produced competitive products. AVeryLargeRadish posted:Ehhh, it would also mean setting up another production line to serve a very small market, a market that they have already captured because AMD can't compete. Production lines are not cheap. I guess that goes back to "how hard is it to simply not include the iGPU" when making the chip. Clearly the addition/subtraction of a few megs of L3 cache isn't a big deal, and which specific iGPU goes into a chip varies quite a bit as well, so there's already some precedent. It does seem silly, though; if you're going to make a "gamer oriented chip," why waste even $1 on an iGPU that your entire market demographic is going to disable the moment they get it? Then again, they did the same thing with Devil's Canyon, so who knows.
|
# ? Aug 9, 2015 23:14 |
|
MaxxBot posted:What's wrong with it? Just curious, they seem to still be significantly above average as far as PC hardware review sites go. Ian's review kind of drops a lot of text in a way that makes me think he probably doesn't understand what he's describing as thoroughly as Anand did. It's also just not as well written; there's a lot of pretty awkward prose. The fact that he spent like two pages of the review talking about "IPC" without bothering to actually expand the acronym was annoying (he helpfully directs me to click on a link to a previous review where they measured IPC instead -- I can find the expansion buried there). I spent at least three confused seconds wondering why the gently caress there's a set of benchmarks specific to interprocess communication. He'll spend a paragraph where it seems like he's trying to explain very basic concepts like I'm five (e.g., caching), which is awkward because I have to mentally translate the babytalk ("oh, he's talking about caching"), but at the same time, in other parts of the review, he breezes past stuff that could stand to be explained more thoroughly. I haven't reread any of Anand's old reviews, so it may be my memory playing tricks on me, but I also recall Anand doing a better job of pulling it all together at the end into a reasonable big-picture view of what it means, who should care, and why. I didn't really follow how he got from the meat of the review to the conclusion that I should upgrade my 2600K (which it doesn't seem like I should actually do). It's probably also true that I'd be less critical of the review if I were happier with the benchmark results!
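(For the record, IPC here is instructions per clock, not interprocess communication. A back-of-the-envelope sketch of how reviewers derive it -- the perf-counter numbers below are made up for illustration, not AnandTech's:)

```python
# IPC (instructions per clock) is just retired instructions divided by
# core cycles for a fixed workload. Reviewers usually approximate it by
# running the same benchmark on both chips locked to the same frequency.
def ipc(instructions_retired, cycles):
    return instructions_retired / cycles

# Hypothetical perf-counter readings for one benchmark run:
sandy_bridge = ipc(instructions_retired=8.0e9, cycles=5.0e9)  # 1.6 IPC
skylake = ipc(instructions_retired=8.0e9, cycles=4.0e9)       # 2.0 IPC

uplift = skylake / sandy_bridge - 1
print(f"IPC uplift: {uplift:.0%}")
```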
|
# ? Aug 9, 2015 23:52 |
|
HalloKitty posted:If only they would offer a die with no GPU and the IHS soldered, it would no doubt cost less to produce overall, and be better in every way that matters to someone buying a top-end CPU for the desktop. The market is probably not big enough to justify creating a set of masks and test equipment just for this configuration. It might actually be counterproductive for enthusiasts to remove the GPU, since when it's not in use it's basically a small on-die heatsink. Xeons allow higher max turbo frequencies if you disable cores because of this thermal effect.
|
# ? Aug 10, 2015 00:02 |
|
frunksock posted:I believe you that it is (above average relative to other reviews). I haven't read any in a long time. One thing I used to like about Anand's reviews was that he did a good job explaining the architecture's features to someone with roughly my level of understanding (basic nerd, but I don't work in silicon). It's something that's rare because it requires that the author actually understands it very thoroughly while also being a good writer. No...your mind is not playing tricks....Anand was great even going back to his days at CPU magazine. Since he slowed down/stopped, I spend most of my time sorting through guru3d and overclockers shitposts to find actually relevant information...and here of course...which actually seems to be the best place.
|
# ? Aug 10, 2015 01:01 |
|
.
sincx fucked around with this message at 05:55 on Mar 23, 2021 |
# ? Aug 10, 2015 01:04 |
|
Generic Monk posted:does 'portable workstation' mean those 2 inch thick dell business monstrosities Not necessarily. For example, the HP EliteBook 840 G1 has the same chassis as the HP ZBook 14 'Mobile Workstation' (33.89 x 23.7 x 2.25 cm, ~1.6 kg). The substantive differences seem to be a different LCD panel and an AMD GPU with FirePro firmware instead of Radeon firmware.
|
# ? Aug 10, 2015 01:44 |
|
So does the QUAD CHANNEL ability of Skylake show any advantage over plain old DUAL CHANNEL, such that mATX or ATX builds with 4x4GB sticks (seemingly pretty much the only DDR4 memory anyone sells right now) would have an advantage over, say, a mITX build with 2x8GB? SpelledBackwards posted:That reminds me, what was the hot new PC tech magazine that popped up in the late '90s or early '00s and featured Anand and people along the lines of him or John Romero for guest columns? I definitely subscribed to that one for a while. Assepoester fucked around with this message at 04:44 on Aug 10, 2015 |
# ? Aug 10, 2015 04:35 |
|
Mainstream (LGA1151) Skylake is still dual-channel; quad-channel stays on the LGA2011-style HEDT sockets.
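For scale, theoretical peak bandwidth is simple arithmetic; the figures below assume DDR4-2133 on a standard 64-bit (8-byte) bus per channel:

```python
# Theoretical peak bandwidth = transfers/sec * bus width (bytes) * channels.
def peak_bandwidth_gbs(mt_per_s, channels, bus_bytes=8):
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

dual = peak_bandwidth_gbs(2133, channels=2)  # mainstream Skylake
quad = peak_bandwidth_gbs(2133, channels=4)  # HEDT platforms
print(f"dual: {dual:.1f} GB/s, quad: {quad:.1f} GB/s")
```

Note that on a dual-channel board, 4x4GB and 2x8GB both populate the same two channels, so peak bandwidth is identical either way.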
|
# ? Aug 10, 2015 04:47 |
|
literally 3 posts up I commented on it >.> Publications Anand is also the author of the book AnandTech Guide to PC Gaming Hardware (ISBN 0-7897-2626-2) and has a regular column in CPU Magazine called Anand's Corner. Source: https://www.wikiwand.com/en/Anand_Lal_Shimpi
|
# ? Aug 10, 2015 04:55 |
|
I first thought of this one, mainly because every month Romero would have columns talking poo poo about everybody in it, and then they gave her a rebuttal column. It was a very weird feeling being barely a teenager and having enough self-awareness to realize "this should be really professionally inappropriate". Botnit fucked around with this message at 05:30 on Aug 10, 2015 |
# ? Aug 10, 2015 05:10 |
|
ACTUAL Game Designer
|
# ? Aug 10, 2015 05:24 |
|
Ffycchi posted:No...your mind is not playing tricks....Anand was great even going back to his days at CPU magazine. Since he slowed down/stopped, I spend most of my time sorting through guru3d and overclockers shitposts to find actually relevant information...and here of course...which actually seems to be the best place. Ding ding ding, that's the mag, thanks. The name was staring me in the face the whole time: Computer Power User (CPU) Magazine. I was probably wrong about Romero (didn't know about PC Accelerator, ha). Not sure who else I was thinking of, though the Wikipedia article does mention Chris Pirillo. Edit: Does this mean now Stevie Case is going to make us her bitch? SpelledBackwards fucked around with this message at 05:33 on Aug 10, 2015 |
# ? Aug 10, 2015 05:29 |
|
DrDork posted:I guess that goes back to "how hard is it to simply not include the iGPU" when making the chip. Clearly the addition/subtraction of a few megs of L3 cache isn't a big deal, and what specific iGPU goes into chips varies quite a bit, as well, so there's already some precedent. It does seem silly, though; if you're going to make a "gamer oriented chip," why waste even $1 on an iGPU that your entire market demographic is going to disable the moment they get it? Then again, they did the same thing with Devil Canyon, so who knows. Would it be possible to eventually use the iGPU for something like physics in games (similar to how you can use a second graphics card for PhysX).
|
# ? Aug 10, 2015 05:42 |
|
Ragingsheep posted:Would it be possible to eventually use the iGPU for something like physics in games (similar to how you can use a second graphics card for PhysX). Sure, but that would involve NVidia sharing their PhysX IP, and I'm pretty sure you can guess how likely that is to happen. The closest thing to useful I've seen out of the Intel iGPUs is that they're available to some video compression platforms and can encode stuff impressively fast (though apparently almost always at lower quality than what CPU-only encoding of the same video would produce).
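For anyone curious, this is the sort of thing the iGPU encode path gets used for. These commands are a sketch that assumes an ffmpeg build with the Quick Sync (QSV) encoder compiled in; the quality settings and filenames are illustrative:

```shell
# Hardware H.264 encode on the Intel iGPU via Quick Sync (h264_qsv).
# Requires an ffmpeg built with QSV support and a working iGPU driver.
ffmpeg -i input.mp4 -c:v h264_qsv -global_quality 25 -c:a copy output_qsv.mp4

# Software x264 equivalent for comparison: much slower, but usually
# better quality at the same bitrate.
ffmpeg -i input.mp4 -c:v libx264 -crf 23 -c:a copy output_x264.mp4
```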
|
# ? Aug 10, 2015 05:47 |
|
PC Accelerator was the poo poo if you were a teen and into computers. That poo poo knew how to laser target their demographic
|
# ? Aug 10, 2015 06:11 |
|
incoherent posted:Anandtech: "Sandy bridge your time is up" AT just went full retard on that one. SB is still so comfortably above the baseline of acceptable CPU gaming performance that telling existing SB users to sink $500+ into a 25% "faster" new platform that barely translates into any real-world advantage is downright idiotic.
|
# ? Aug 10, 2015 06:30 |
|
Botnit posted:
Ah, computers and video games....where a 6 can feel like a 10
|
# ? Aug 10, 2015 06:59 |
|
Panty Saluter posted:Ah, computers and video games....where a 6 can feel like a 10
|
# ? Aug 10, 2015 07:12 |
|
Josh Lyman posted:To 13 to 29-year-old computer nerds, that cover is a honeypot. Sex appeal, or because she wrote a guide for Daikatana?
|
# ? Aug 10, 2015 08:02 |
|
I'm getting a daikatana iykwim. Also, it seems like it's better to spend money on something like a 980 Ti now if you just want to increase your FPS in games.
|
# ? Aug 10, 2015 10:45 |
|
Ragingsheep posted:Would it be possible to eventually use the iGPU for something like physics in games (similar to how you can use a second graphics card for PhysX). DirectX 12 allows dissimilar GPUs to be used together. One of the examples they give is using a discrete GPU to render and then your slower iGPU to do the post-processing while the discrete GPU renders the next frame. They got something like an extra 10% FPS, but it added nearly 2x the latency between frames.
|
# ? Aug 10, 2015 13:27 |
|
So I'm still on an i7 920. It's beginning to get a little long in the tooth, so I'm wondering what to do. Is it worth actually looking towards Skylake and DDR4, or is it just better to go with a 2nd-hand Haswell-era chip and be done with it? I have 18GB total of triple-channel RAM, half 10666 and half 12800, so I don't think I can really re-use that, as it seems everyone went back to dual channel? I was sort of waiting on Skylake to drop and see how it panned out, but by the looks of it, it may just be easier to cycle a 2nd-hand Haswell than pay the latest-gen premium. I don't really see a benefit to early-adopting DDR4 because the performance seems to be a near-run thing, and the socket is just going to change so I can't recycle the mobo in a few years anyway.
|
# ? Aug 10, 2015 14:34 |
|
Nam Taf posted:So I'm still on an i7 920. It's beginning to get a little bit long in the tooth, so I'm wondering what to do. Is it worth actually looking towards Skylake and DDR4 or is it just better to go with a 2nd-hand Haswell era chip and be done with it? I have 18GB total of triple channel RAM, half 10666 and half 12800 so I don't think I can really re-use that as it seems everyone went back to dual channels? What's driving your upgrade, do you need more CPU power and have a highly threaded workload? If so, you want a 5820K and X99. Do you want a more modern chipset, onboard stuff, and peripherals? Either H97 or Skylake should meet your needs unless you are imminently expecting to buy a PCI-E SSD, in which case you want Skylake. If you just want single threaded CPU speed, then a 4790K is the way to go. If you've waited this long and don't mind waiting more, the non-overclocking Skylake chips look like they'll be more competitive overall. Not faster than a 4790K / 6700K, but with way better thermals, power consumption, and that price premium on platform should shrink when H170 launches.
|
# ? Aug 10, 2015 14:41 |
|
DrDork posted:I guess that goes back to "how hard is it to simply not include the iGPU" when making the chip. Clearly the addition/subtraction of a few megs of L3 cache isn't a big deal, and what specific iGPU goes into chips varies quite a bit, as well, so there's already some precedent. It does seem silly, though; if you're going to make a "gamer oriented chip," why waste even $1 on an iGPU that your entire market demographic is going to disable the moment they get it? Then again, they did the same thing with Devil Canyon, so who knows. For the desktop market, Intel really only fabs three dies: dual-core mobile, quad-core mobile, and 8-core enterprise. The desktop market just receives the "waste" of these production lines: some faulty cache, a die that's too leaky, or some non-operational cores. The worse dual-core mobile bins become the Celerons, Pentiums, and i3s; the worse quad-core mobile bins become i5s and i7s; and the worst-binned server chips become the HEDT chips. So in the end it doesn't cost Intel anything to include the iGPU on the mainstream desktop chips, because it's already there as a result of those chips being originally fabbed for mobile.
|
# ? Aug 10, 2015 17:10 |
|
.
sincx fucked around with this message at 05:55 on Mar 23, 2021 |
# ? Aug 10, 2015 17:34 |
|
I'm debating a Skylake upgrade. I'm on a 2500k @ 4.2ghz. It has served me quite well, and I know that switching to Skylake would be expensive due to needing a new motherboard and RAM as well. What I'm most interested in, and haven't seen much of, is benchmarks in single-threaded games. The reason I bring this up is the main game I play on my PC is Stalker: Call of Pripyat because I'm always actively modding it, and although it technically uses 2-cores, it's mostly 1-core. If Skylake doesn't improve from Sandy Bridge much on single-threaded performance I'm thinking I should at least wait until Skylake Refresh, when hopefully higher-clocks and/or 6-core processors are out.
|
# ? Aug 10, 2015 20:20 |
|
Swartz posted:I'm debating a Skylake upgrade. I'm on a 2500k @ 4.2ghz. It has served me quite well, and I know that switching to Skylake would be expensive due to needing a new motherboard and RAM as well. How badly do you want that single-threaded performance? If you splash out for a 6700K, it should be about 15-25% faster single-threaded than a 2500K @ 4.2. It's certainly not a huge leap.
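A rough way to sanity-check that 15-25% figure: single-threaded performance scales roughly as IPC times clock. The IPC uplift below is an assumed ballpark from reviews, not a measured number:

```python
# Single-thread performance is roughly proportional to IPC * clock,
# so the relative speedup is (1 + ipc_gain) * (new_clock / old_clock).
def relative_perf(ipc_gain, old_clock_ghz, new_clock_ghz):
    return (1 + ipc_gain) * new_clock_ghz / old_clock_ghz

# Assumed ~20% cumulative IPC gain Sandy Bridge -> Skylake; a 2500K
# overclocked to 4.2 GHz vs a 6700K boosting to roughly the same clock.
speedup = relative_perf(ipc_gain=0.20, old_clock_ghz=4.2, new_clock_ghz=4.2)
print(f"~{speedup - 1:.0%} faster single-threaded")
```

At equal clocks the gain is almost entirely IPC, which is why the upgrade looks so modest from an already-overclocked Sandy Bridge.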
|
# ? Aug 10, 2015 20:23 |
|
Twerk from Home posted:How badly do you want that single threaded performance? If you splash out for a 6700K, it should be about 15-25% faster single threaded than a 2500K @ 4.2. It's certainly not a huge leap. It's not imperative, but it would be very nice. The main issue with Stalker is that it's very CPU intensive, and because it uses just one core it tends to have lots of stuttering (it can't be an I/O issue as I'm on an SSD, and it's not the GPU as I have a GTX 970). 25% would be nice; if it had a minimum of that I'd upgrade, otherwise I think I'll wait until Skylake Refresh or maybe even whatever is after that, though I was hoping to build my new pc in mid-2016.
|
# ? Aug 10, 2015 20:26 |
|
Swartz posted:It's not imperative, but it would be very nice. The main issue with Stalker is that it's very CPU intensive and due to using just one core it tends to have lots of stuttering (and it can't be an I/O issue as I'm on a SSD, and it's not GPU as I have a GTX 970). There's another dark horse option: the i7-5775C. The Tech Report found it to outperform the 6700K in gaming workloads: http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/14 The issue is that its performance varies greatly by game. If having 128MB of eDRAM cache on the package helps STALKER, the 5775C will deliver a much better experience than a 6700K; if it doesn't, then it's slower. It's one of those things that has to be tested to see.
|
# ? Aug 10, 2015 20:30 |
|
Nam Taf posted:So I'm still on an i7 920. It's beginning to get a little bit long in the tooth, so I'm wondering what to do. Is it worth actually looking towards Skylake and DDR4 or is it just better to go with a 2nd-hand Haswell era chip and be done with it? I have 18GB total of triple channel RAM, half 10666 and half 12800 so I don't think I can really re-use that as it seems everyone went back to dual channels? I am in a similar situation with an i7 930, and found out that buying a used hexa-core Xeon X5670 (any Xeon 5600-series chip, really) for $75 can work. I can retain my motherboard and RAM and have a Sandy Bridge-equivalent chip (with overclock) for a cheap upgrade to hold me over a little longer. It is worth looking into. Does anyone else have experience with upgrading a Nehalem Bloomfield to a Xeon Westmere-EP?
|
# ? Aug 10, 2015 20:34 |
|
Swartz posted:It's not imperative, but it would be very nice. The main issue with Stalker is that it's very CPU intensive and due to using just one core it tends to have lots of stuttering (and it can't be an I/O issue as I'm on a SSD, and it's not GPU as I have a GTX 970). It might be cheaper to buy a H110i GT and try to push your 2500K. It's possible to get faster single core speeds than even a 4790K.
|
# ? Aug 10, 2015 21:27 |
|
NarDmw posted:I am in a similar situation with a i7 930, and found out that buying a used hexacore xeon x5670 (any xeon 5600 series chip really) for $75 can work. I can retain my motherboard, ram and have a Sandy Bridge equivalent chip (with overclock) for a cheap upgrade to hold me out for a little longer. It is worth looking into. Does anyone else have experience with upgrading a nehalem Bloomfield to a xeon westmere-ep? Yeah, I did exactly what you did - tossed an old 920 and bought a X5670 Westmere Xeon, then threw a 212 Evo cooler on it and bumped it to 3.55ghz. It's still down a bit on per-core performance compared to my 2500k or 3570k machines, but I just wanted 12 threads to play with and only have a couple hundred bucks in the mobo/proc/HSF. The Westmere Xeons are on 32nm fab rather than the 45nm process of the 920, so power draw isn't much more than the quad-core.
|
# ? Aug 11, 2015 00:37 |
|
Twerk from Home posted:There's another dark horse option: The i7-5775c. Techpowerup found it to outperform the 6700K on gaming workloads: http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/14 Availability is an issue, yes, but the 5775C is criminally underrated. It's the only chip I would recommend a Z97 board for, since it has amazing gaming performance at such low clocks and power draw.
|
# ? Aug 11, 2015 01:10 |
|
Palladium posted:Availability is an issue yes, but 5775C is criminally underrated. This is the only chip I would recommend a Z97 board since it has amazing performance for gaming at such low clocks and power draw. It's also more expensive than a 6700K by MSRP, which will likely hold true if availability is an issue. Also, most Z97 motherboards will require a BIOS update before they can take an i7-5775C. So that's another pain in the rear end; I don't think they can update their BIOS without an already-compatible chip.
|
# ? Aug 11, 2015 02:04 |
|
|
|
JnnyThndrs posted:Yeah, I did exactly what you did - tossed an old 920 and bought a X5670 Westmere Xeon, then threw a 212 Evo cooler on it and bumped it to 3.55ghz. It's still down a bit on per-core performance compared to my 2500k or 3570k machines, but I just wanted 12 threads to play with and only have a couple hundred bucks in the mobo/proc/HSF. Did you find that to be a useful/good value upgrade that allowed you to game at 1080p?
|
# ? Aug 11, 2015 03:01 |