|
MachinTrucChose posted:Overclocking is a stupid waste of money and shouldn't be done Boy, I sure feel stupid overclocking my 2.33GHz CPU to 3.4GHz for no cost other than a modest, quiet cooler that was maybe $50. Clocked a 2.33GHz Core 2 Quad to 3GHz on the stock cooler, too, so that cost a grand total of $0. HalloKitty fucked around with this message at 14:30 on Nov 5, 2010 |
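For what it's worth, the gains being claimed are easy to put a number on. A minimal sketch; the clock figures are the ones from the post above, everything else is plain arithmetic:

```python
def oc_gain(stock_ghz, oc_ghz):
    """Percentage frequency increase from an overclock."""
    return (oc_ghz / stock_ghz - 1) * 100

# Figures from the post above
print(f"{oc_gain(2.33, 3.4):.0f}%")  # 2.33 -> 3.4 GHz: 46%
print(f"{oc_gain(2.33, 3.0):.0f}%")  # 2.33 -> 3.0 GHz on the stock cooler: 29%
```

A 46% clock bump for the cost of a cooler (or 29% for free) is hard to call a waste of money.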
# ¿ Nov 5, 2010 14:28 |
|
Oh, I don't know. A Q6600 with a healthy overclock is probably still on par with Bulldozer for IPC
|
# ¿ Oct 22, 2011 14:47 |
|
Predictable and expected, but the 2700K is here. http://www.anandtech.com/show/5009/intel-releases-core-i72700k-and-cuts-the-prices-of-three-CPUs No price cuts of real interest (just low-end stuff).
|
# ¿ Oct 25, 2011 00:50 |
|
Not all that impressed, really. A binned 8-core Xeon with two cores disabled? The thing is ridiculously huge. No USB3 support, no Smart Response caching, no QuickSync (although honestly, unless x264 gains QuickSync support, who cares about this?). Seems extremely unpolished. Yes, it's the absolute fastest. But really, you'd have to be a fool to do anything but get a 2500K and overclock it right now. Ivy Bridge will be worth the money; Sandy Bridge-E, not so much. I'd have liked it if they'd had the 12MiB and 15MiB versions of the chip in the review, so you could see where your HUGE LUMP of extra cash went, because let's be honest, it didn't go into 100MHz on the base and turbo clocks. Although, no more push pins. Yay! HalloKitty fucked around with this message at 12:33 on Nov 14, 2011 |
# ¿ Nov 14, 2011 12:31 |
|
freeforumuser posted:Ivy Bridge SKUs leaked: This post summed it up nicely: "Looks like the raw performance will probably not be much greater than SB. But the thermals will be pretty significant. And thanks to AMD's incompetence, Intel has no incentive to bring out high-clocked IB." I'm pretty sure Intel could have clocked some higher, but they were shooting for lower TDPs. The interesting wildcard here is overclocking. I'd imagine they'll overclock better. Reviews are going to be interesting.
|
# ¿ Nov 30, 2011 22:21 |
|
Agreed posted:They've gotta be at least binning for the 2700/2600. Like, I basically feel like what happened there is the chip lottery for higher clocking 2600Ks got more difficult when they introduced the 2700K and now there's a price premium for chips that might clock higher a little easier. :/ For all you know, they could be binned identically. I mean really, has anyone ever had a problem overclocking their 2600K by 100MHz? I can't even imagine it could ever happen, since the chip is obviously designed to scale up to its turbo frequency.. and that's only 100MHz more on the 2700K too.
|
# ¿ Dec 2, 2011 21:19 |
|
Gwaihir posted:Good thing nearly all laptops have piece of poo poo 1366 x 768 displays these days I actually seriously want a 15" Macbook Pro because you can get it with a 1680x1050 anti-glare display. Obviously I can't afford one, but I support the fact that Apple offers a myriad of 16:10 options. 13.3"? 1440x900. Delightful stuff.
|
# ¿ Dec 8, 2011 20:49 |
|
mobby_6kl posted:Look at this scrub with his low-res screen. I've been posting from a T520 with the 1080p screen and SSD for about a week I've had a laptop for years that has a 1920x1200 screen. It's the rest of the hardware that's archaic
|
# ¿ Dec 14, 2011 23:35 |
|
incoherent posted:Seriously. Free performance on the table. The (nice) boards will even do all the heavy lifting for you. Wait, there was heavy lifting involved? Maybe if you want to go for 4.5+
|
# ¿ Jan 16, 2012 11:05 |
|
mayodreams posted:He was just saying the better boards (I just got an Asus P68Z68-V) literally do push button overclocking on Sandy Bridge. I got the Cooler Master Hyper 212 and clicked two things in the BIOS and have been running stable at 4.4GHz for about 3 weeks now. It was the easiest overclocking experience I've ever had. I have the same board (P8Z68-V Pro). Maybe we're just spoilt. I was basically just trying to highlight how easy it is to do a small overclock on Sandy Bridge. Christ, I even found it very easy on earlier Core 2 platforms as well.
|
# ¿ Jan 16, 2012 14:49 |
|
Wedesdo posted:You aren't going to be able to overclock. Are you okay with that? He said earlier in the thread that he didn't give the tiniest poo poo about overclocking, so I guess that's not a concern.
|
# ¿ Jan 18, 2012 11:09 |
|
Honestly, what would be interesting is if they simply had a high success rate at 5GHz on reasonably quiet air with reasonable voltage. Which I imagine is possible, seeing as it's an improvement all round on SB on a smaller process. But it'd still be nice to see.
|
# ¿ Feb 29, 2012 23:58 |
|
Wedesdo posted:http://www.nordichardware.com/news/69-cpu-chipset/45720-ivy-bridge-gets-95w-tdp-worse-overclocker-than-sandy-bridge.html As a correction/addition to this: http://www.nordichardware.com/news/69-cpu-chipset/45738-ivy-bridge-sells-with-95w-tdp-but-uses-a-maximum-of-77w.html Still, if the absolutely dismal overclocking performance is true, then Ivy Bridge must be considered a CPU to pass on. To be fair though, Intel, you already gave us the gift of the 2500K, so all is forgiven for a while. HalloKitty fucked around with this message at 14:15 on Apr 18, 2012 |
# ¿ Apr 18, 2012 14:11 |
|
Here's the all-important quote:AnandTech posted:My recommendation – if you run an overclocked Sandy Bridge system now, do not jump to Ivy Bridge. You may be severely disappointed by the overclocking performance. So yeah, people aren't going to be upgrading from Sandy Bridge. The main benefit of Ivy Bridge is actually mundane in a way - it should provide a much higher baseline for integrated graphics performance, which will make picking out a laptop in future more enjoyable, as you won't always feel like you have to keep searching for one with a dGPU. vv There's no doubt it was necessary for all these new things to come together. I'll be most interested to see laptops based on Ivy Bridge. HalloKitty fucked around with this message at 18:56 on Apr 23, 2012 |
# ¿ Apr 23, 2012 18:38 |
|
Badmana posted:I convert several movies/tv shows a week to watch on my 5" streak Have you tried MX Video Player? I haven't got a Streak, but give it a go. Transcoding seems like such awful nonsense, and I hate that devices don't ship with more codecs with hardware acceleration. I have a piece of crap Android tablet that cost me next to nothing, yet it decodes and plays even 1080p MKV just fine (although scaled down to 800x480). All thanks to a little smart software engineering, something the big names don't seem to get. HalloKitty fucked around with this message at 12:18 on Apr 25, 2012 |
# ¿ Apr 25, 2012 10:07 |
|
"Ivy Bridge's improved IPC is almost entirely mitigated by its reduced overclocking headroom." http://www.anandtech.com/show/5787/ibuy-power-erebus-gt-review-ivy-bridge-and-nvidias-geforce-gtx-680-in-sli/2 Also, I'm pleased by the 990X's showing. I maintained a long time ago that although the 980X was a ridiculously expensive CPU, it's surprisingly held its own. That is still true. It doesn't look like bad value now compared to buying Nehalem + Sandy Bridge + Ivy Bridge. vv Oh, I agree that it's still a ridiculous cost, I don't have one. But compared to some of the retarded "extreme" things we've seen in the past, it's not that bad. Compare it to, say, Pentium 4 Extreme, Pentium Extreme Edition, or really most SLI setups, especially back in the day. I knew a guy who had 7800GTX SLI (256MB cards) and, well, he always had overheating issues, it never performed that well, and it cost a crazy amount and was eclipsed extremely soon. Then again, come to think of it, he also had VapoChill, and removing his CPU caused the brittle pins to shatter. HalloKitty fucked around with this message at 09:22 on Apr 27, 2012 |
# ¿ Apr 27, 2012 08:47 |
|
Badmana posted:I just downloaded it. Nice piece of software (I like the ease of forwarding and reversing) but my Streak only has a 800 x 480 screen. I've always thought I should down scale a movie to 800 x 480 to both save space (sd card only supports FAT32, no +4 gig files) and to reduce stutter. Does MX player scale down without trouble? It should scale down just fine, but whether it runs smoothly, you'll just have to copy a file to it and see what happens.
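On the space question, you can sanity-check the FAT32 limit before transcoding at all: estimated output size is just combined bitrate times runtime. A back-of-the-envelope sketch; the 1500/128 kbps figures and the two-hour runtime are made-up examples, not recommendations:

```python
FAT32_LIMIT = 4 * 1024**3 - 1  # FAT32 caps a single file at 4 GiB minus one byte

def estimated_size_bytes(video_kbps, audio_kbps, runtime_min):
    """Rough output size: (video + audio bitrate) x runtime, converted to bytes."""
    total_bits = (video_kbps + audio_kbps) * 1000 * runtime_min * 60
    return total_bits // 8

# Hypothetical example: two-hour film at 1500 kbps video + 128 kbps audio
size = estimated_size_bytes(1500, 128, 120)
print(f"{size / 1024**3:.2f} GiB")  # about 1.36 GiB, comfortably under the limit
```

So at sane bitrates for an 800x480 target, the 4GB ceiling is rarely a problem in the first place.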
|
# ¿ Apr 27, 2012 15:31 |
|
Factory Factory posted:Ugh, watch it not switch to soldering until IVB-E. Probably. What the hell is Intel playing at? If all Ivy Bridge chips really are covered with TIM and then a heatspreader, they're blue-balling the poo poo out of everyone in an extremely cynical move. I don't see how it could do anything other than severely impact cooling performance. Whether removing it really translates into better overclocking.. well.. I'm hoping we see a deluge of de-lidded Ivy Bridge overclocking articles soon.
|
# ¿ Apr 27, 2012 23:11 |
|
Sandy Bridge-E hexa-core is still the highest end platform (especially if you need a crazy amount of RAM), but you just need to know whether it'll be worth the money.. depends what you do, really, but I doubt you'd be limited by a regular Ivy Bridge chip for a lot less. http://www.anandtech.com/bench/Product/443?vs=551 HalloKitty fucked around with this message at 16:52 on Jun 1, 2012 |
# ¿ Jun 1, 2012 16:48 |
|
Zhentar posted:And of course, it only does you any good if you can actually make use of six cores. The big draw is the 8 RAM slots - if you need it, it's one of those things vv In that case you're not the person Sandy Bridge-E is for. Definitely just get yourself a 3570k or a 3770k and a nice Z77 board HalloKitty fucked around with this message at 19:34 on Jun 1, 2012 |
# ¿ Jun 1, 2012 17:44 |
|
hobbesmaster posted:Doesn't seem unreasonable that each tick-tock set would have a new socket. Isn't that tock-tick? (new-shrink)
|
# ¿ Jul 2, 2012 13:13 |
|
Toast Museum posted:That has always seemed completely backwards to me. I agree. I don't know why it's that way.
|
# ¿ Jul 3, 2012 11:51 |
|
Agreed posted:that you don't need RAM faster than DDR3 1333mhz. When I built my Sandy Bridge 2500K system, I could swear the sweet spot was 1600.. Of course, if you start talking about using integrated graphics or AMD Fusion then RAM speed matters more. Edit: Yeah, it was an AnandTech article: http://www.anandtech.com/show/4503/sandy-bridge-memory-scaling-choosing-the-best-ddr3/8 vvv Their wording: "The sweet spot appears to be at DDR3-1600". Not that it matters much really, since 1600 is cheap anyway. But the message of staying away from the "EXTREME" stuff is, as usual, a relevant one. HalloKitty fucked around with this message at 23:08 on Aug 4, 2012 |
# ¿ Aug 4, 2012 22:54 |
|
pienipple posted:I ran Ubuntu for a long time with a 30GB partition for / and it never got more than half full, but if you're running Linux you probably don't need the creme de la creme motherboard. And if you're buying that motherboard you're probably expecting to be doing a lot of gaming and therefore running Windows primarily. Huh, dual booting to a Linux install would be a pretty good use for it.
|
# ¿ Aug 15, 2012 20:36 |
|
mayodreams posted:This is a really interesting look at CPU performance in gaming. I am surprised there is that much difference in latency between Intel and AMD. It was not far-fetched to imagine Bulldozer wouldn't beat Sandy Bridge back then, but the thing nobody expected was that it would be worse than their own previous products. It really was a poor show.
|
# ¿ Aug 24, 2012 18:00 |
|
Alereon posted:Here's the article at TechReport. It's kind of depressing to see that my four year old Core 2 Quad Q9650 is still better than any AMD processor at gaming. Yorkfield was a beast. I had a Q9550 which I ran all the time at 3.8GHz. In fact, the only reason I upgraded to a 2500K is because I had some hard to pin down problem, which I think was possibly down to the motherboard. That chip had a monstrous level 2 cache compared to even the stuff you get now.
|
# ¿ Aug 24, 2012 19:05 |
|
I'd rather use all the on-die cache as cache for all other tasks and buy the most ridiculous discrete GPU possible, really. (I don't do that now, I have a 6950; I'm just voicing a viewpoint.)
|
# ¿ Sep 14, 2012 22:19 |
|
Agreed posted:Great GPGPU performance a few generations from now would really make the whole idea of "GPGPU" as a separate category weird, at that point we'd need a different name for it, maybe just go back to referring to them as coprocessors or adopt a more generalized language like "APU" in the legitimate sense of the word. I know you have a PhysX drum to beat, and I have an vv Sorry, I meant "Why would Intel decide to accelerate PhysX?" HalloKitty fucked around with this message at 22:59 on Sep 14, 2012 |
# ¿ Sep 14, 2012 22:27 |
|
Surely DirectCompute and OpenCL can provide? I guess you mean an open, physics-specific framework built on one of those.
|
# ¿ Sep 14, 2012 23:15 |
|
Alereon posted:I don't know how you can see benchmarks of AMD FX-series processors and say they're "good enough." Yeah, obviously if you're doing something that doesn't depend on the CPU you won't notice much, but that's also true of a five year old machine. Start playing Skyrim and you'll notice framerates being cut in half in high-action sections, while even last-generation Intel processors continue to run at 60fps. Granted Piledriver can almost catch up on workloads that can use all 8 cores, but how common is being able to use four cores, much less 8 on desktop workloads? Keep in mind that these huge performance gaps exist on today's workloads, tomorrow's games that don't have to keep CPU usage low enough to run on a console is going to see a much larger difference. Skyrim is known for being CPU heavy, others will fare better, but only because they skew more to GPU performance. The ultimate point is there are definite scenarios in which you will notice the difference. HalloKitty fucked around with this message at 10:35 on Nov 27, 2012 |
# ¿ Nov 27, 2012 10:29 |
|
teh z0rg posted:Overclocking isn't about realistic improvement. It's about raw numbers that go up when you clock. The person with the highest numbers wins. Eh, not true. Some things do benefit massively from overclocking. Anything that burns maximum CPU for some time, like video encoding..
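To put a number on that: for a fully CPU-bound job like an x264 encode, throughput scales roughly linearly with clock (assuming no memory or I/O bottleneck), so the time saved falls straight out of the clock ratio. A sketch with made-up figures; the 60-minute encode and the clocks here are hypothetical:

```python
def encode_time(base_minutes, stock_ghz, oc_ghz):
    """Assume throughput scales linearly with clock for a fully CPU-bound encode."""
    return base_minutes * stock_ghz / oc_ghz

# Hypothetical: an encode that takes 60 min at 3.3 GHz, run at a 4.4 GHz overclock
print(round(encode_time(60.0, 3.3, 4.4), 1))  # 45.0 minutes
```

A quarter of the runtime gone is a very real improvement, not just a bigger number on a screenshot.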
|
# ¿ Jan 24, 2013 18:17 |
|
spasticColon posted:Will it be worth upgrading to Haswell from a i5-2500K chip for gaming? I would only have to upgrade the CPU and motherboard and keep everything else right? I'm pretty certain a 2500K with a healthy overclock is not going to be out of date any time soon. However, once the new consoles arrive, we can expect things to move in terms of requirements as a knock-on effect.
|
# ¿ Mar 24, 2013 14:57 |
|
hobbesmaster posted:13" rMBP can have a 35W tdp processor in it. Thats probably about as close as you'll get. I have a Sony Vaio Z12, 13.1", that has a 35W TDP CPU (i5-520M) AND a GeForce GT 330M in a similar size/weight, so it's possible to squeeze quite a bit in.
|
# ¿ Mar 28, 2013 08:49 |
|
Palladium posted:Pre 2000 era PCs were already slow for their time because most of them were installed with too little RAM to start with. This was usually the biggest problem, even some way into the 2000s. I remember XP desktops shipping with 128MiB RAM, then being loaded with an image covered in garbage such as corporate antivirus and so on. Basically we had an era of machines that were expected to swap all day long to their slow rear end IDE drives, and nobody thought that was a problem (other than the users). Eventually things picked up, and we had even the cheapest XP machines shipping with 512MiB. Then Vista hit, and the cheapest machines still shipped with 512MiB, and it was 2001 all over again. Swap city!
|
# ¿ Apr 14, 2013 11:14 |
|
Shaocaholica posted:Might still take a generation for people to stop delidding Haha, let them. I wonder if anyone tried to delid their Sandy Bridge? That would be hilarious
|
# ¿ Apr 30, 2013 19:49 |
|
Alereon posted:There was buzz yesterday that Intel is considering an acquisition of AMD to form a unified front against ARM, but that's probably just analysts being retarded. I can't imagine AMD management or shareholders going for that given they are about to start making a SHITTON of money from consoles for 5+ years. Also, I thought the reason AMD had an x86 licence was so they could provide an alternate source for the CPUs. Intel and AMD merging would ring all available monopoly bells
|
# ¿ May 2, 2013 21:14 |
|
Factory Factory posted:Yeah, it began trickling out soon after that it wasn't necessarily that no PSU was ready for that load, just that nobody had really tested a load that low and many units would probably work. Aw yeah. I have an X-660. For my own systems, and for people who are willing to spend the money, I'd always recommend Seasonic X and Platinum series. But I think the main reason they are going to work is that they are DC-DC for the 3.3 and 5v rails, so there's always some load on the 12v rail.
|
# ¿ May 11, 2013 16:02 |
|
Also, talking about OS X lacking features, the irony is that people think it's for graphics designers... yet it doesn't support 10-bit colour displays.
|
# ¿ May 12, 2013 17:26 |
|
Martello posted:I use my computer primarily for gaming, entertainment, and writing. I've been planning on upgrading my CPU from the Sandy Bridge i7 I have now. Is it worth waiting for Haswell or should I just grab an Ivy Bridge right now? Getting a new motherboard would also be a big pain in the rear end. What Sandy Bridge? If it's a 2500K, 2600K or 2700K, you will see almost no benefit from upgrading to Ivy Bridge. Sandy Bridge overclocks better, and Ivy Bridge doesn't offer any significant performance increases for your usage scenarios. Edit: scratch that, you say i7. If that's a 2600K or 2700K, you'd be totally wasting your time and money
|
# ¿ May 28, 2013 12:29 |
|
|
movax posted:Yeah, you shouldn't be thinking about upgrading for a long while, unless you absolutely need every single last drop of performance / have a lot of money to burn. Tried to find some info on 3770K vs 2700K overclocked. At least looking at that test, if you can overclock the 2700K 200MHz more than the 3770K, the gains of Ivy Bridge evaporate. That's not too much to ask of a 2700K, either. Also, the gains are almost totally meaningless in gaming at a resolution you might play at. There's a good article on AnandTech: http://www.anandtech.com/show/6934/choosing-a-gaming-cpu-single-multigpu-at-1440p HalloKitty fucked around with this message at 16:18 on May 28, 2013 |
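The break-even arithmetic behind that claim, as a sketch: treat performance as IPC times clock, with Ivy Bridge's clock-for-clock advantage over Sandy Bridge assumed at roughly 5% (a commonly quoted ballpark, not a measured figure) and the clocks below picked as plausible overclocks, not guarantees:

```python
def relative_perf(ipc, clock_ghz):
    """Crude performance proxy: IPC multiplied by clock."""
    return ipc * clock_ghz

# Assumptions: SB IPC normalised to 1.0, IB ~5% higher; hypothetical OC clocks
sb = relative_perf(1.00, 4.8)  # 2700K at 4.8 GHz
ib = relative_perf(1.05, 4.6)  # 3770K at 4.6 GHz
print(round(sb, 2), round(ib, 2))  # 4.8 vs 4.83, within a hair of each other
```

With those assumptions, 200MHz of extra Sandy Bridge headroom wipes out the IPC gain almost exactly.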
# ¿ May 28, 2013 16:12 |