|
movax posted:I believe Xeon part numbering is supposed to be:
|
# ? May 15, 2012 17:01 |
|
|
Bob Morales posted:Oh good, that's public! I was sure I had seen it on Engadget or something.
|
# ? May 15, 2012 17:04 |
|
I miss when it was just Celeron, Pentium II and Pentium II Xeon
|
# ? May 15, 2012 17:09 |
|
Anandtech has a review up of the Asus UX21A Zenbook Prime (11.6" Ivy Bridge Ultrabook): http://www.anandtech.com/show/5843/asus-zenbook-prime-ux21a-review quote:With the exception of the SSD and Windows 7's unfortunate lack of elegant DPI scaling, the Zenbook Prime is the epitome of Ultrabook perfection. It has all of the build quality that we loved about the original Zenbook, with almost none of the quirks.
|
# ? May 22, 2012 20:26 |
|
Cicero posted:Anandtech has a review up of the Asus UX21A Zenbook Prime (11.6" Ivy Bridge Ultrabook): http://www.anandtech.com/show/5843/asus-zenbook-prime-ux21a-review 1920x1080 on an 11.6"?? WHY?
|
# ? May 22, 2012 20:33 |
|
Bob Morales posted:1920x1080 on an 11.6"?? WHY? Why wouldn't you want that, as long as the GPU can keep up?
|
# ? May 22, 2012 20:49 |
|
Bob Morales posted:1920x1080 on an 11.6"?? WHY? Not to mention High DPI assets look like sex. High resolution doesn't have to mean small text, you just get the same sized text with things that look better (iPhone 3GS vs iPhone 4 or iPad2 vs new iPad)
|
# ? May 22, 2012 20:51 |
|
Bob Morales posted:1920x1080 on an 11.6"?? WHY? Yeah, W7 and earlier suck on high DPI displays (and I'm not sure W8 is better on the desktop). I know Apple is driving towards high DPI displays (and this is a wonderful development), but the OS has to properly support it. Releasing a 189dpi display on Windows, today, is crazy. The 13.3" version is extremely tempting though. Now to convince my boss I need a new laptop...
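For what it's worth, the ~189 dpi figure is easy to sanity-check from the panel specs alone: pixel density is just the diagonal pixel count over the diagonal size in inches. A quick sketch:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 1920x1080 on the 11.6" UX21A panel
print(round(ppi(1920, 1080, 11.6)))  # -> 190 (the ~189 dpi quoted above)
# the same resolution on the 13.3" version
print(round(ppi(1920, 1080, 13.3)))  # -> 166
```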
|
# ? May 22, 2012 20:51 |
|
I'm seriously looking at the $799 UX32A. Sure you give up having an SSD, and the screen is mediocre instead of awesome, but the price!
|
# ? May 22, 2012 22:50 |
|
I wonder how much of a game performance boost the NVIDIA 620M will give the UX32VD-DB71. I really like the 11.6" form factor since I want to maximize portability after owning my 15.6" E1505 for the past 6 years, but I would be willing to jump up to 13.3" if there were a significant increase in the playability of games. I'm not really looking for a gaming machine necessarily, but it seems like it would be dumb to overlook.
|
# ? May 23, 2012 15:03 |
|
Install Gentoo posted:Why wouldn't you want that, as long as the GPU can keep up? It'd be fine if it wasn't Windows/Linux. I really think ~ 135 ppi is about as small as I can handle without getting into zooming (assuming it works on your OS)
|
# ? May 23, 2012 15:07 |
|
bull3964 posted:I wonder how much of a game performance boost the NVIDIA 620M will give the UX32VD-DB71. I really like the 11.6" form factor as I want to maximize portability after owning my 15.6" E1505 for the past 6 years, but I would be willing to jump up to 13.3 if there was a significant increase in the play-ability of games. I'm not really looking for a gaming machine necessarily, but it seems like it would be dumb to overlook.
|
# ? May 23, 2012 19:59 |
|
Cicero posted:I couldn't find any reviews, but I would guess that the 620M is only a marginal improvement over the HD4000. Just trying to pin down what the hell any of nVidia's or AMD/ATI's low-end SKUs even mean is an exercise in frustration. There's one that has, like, one Shader Module/pack of CUDA cores and the smallest possible number of ROP units, which would be worse than HD4000 graphics by a stretch. I think that might be it, but it's hard to tell, because there are several of them with the same name, some of which are rebadged (or re-rebadged, no poo poo) Fermi products at that. I want to say that you're more likely to get a Kepler based piece of poo poo in a laptop and that they saved their "we are actively loving consumers with a product they don't need at all anymore" SKU nonsense for the lowest end discrete cards. But Fermi or Kepler, an _20 is going to be a piece of poo poo however you slice it, and the HD4000 is actually pretty keen, contextually speaking.
|
# ? May 23, 2012 21:11 |
|
bull3964 posted:I wonder how much of a game performance boost the NVIDIA 620M will give the UX32VD-DB71.
|
# ? May 24, 2012 01:05 |
|
Agreed posted:Just trying to pin down what the hell any of nVidia's or AMD/ATI's low-end SKUs even means is an exercise in frustration. There's one that has, like, one Shader Module/pack of CUDA cores and the smallest possible number of ROP units which would be worse than HD4000 graphics by a stretch. I think that might be it, but it's hard to tell, because there are several of them with the same name, some of which are rebadged (or re-rebadged, no poo poo) Fermi products at that. Would not be surprised at all if it was a rebadged 520M which is a rebadged 420M which is a rebadged 320M which is really a 250M which is basically a glorified 8800GT with half the shader units disabled. Not saying this is a fact but that it's even plausible goes to show how loving stupid nVidia's numbering schemes are.
|
# ? May 24, 2012 03:43 |
|
Nostrum posted:Would not be surprised at all if it was a rebadged 520M which is a rebadged 420M which is a rebadged 320M which is really a 250M which is basically a glorified 8800GT with half the shader units disabled. Not saying this is a fact but that it's even plausible goes to show how loving stupid nVidia's numbering schemes are.
|
# ? May 24, 2012 03:48 |
|
The 620M is at least a 28nm part with "up to" 28.8 GB/sec, so there is that. pre:
                 GeForce GT 635M        GeForce GT 630M          GeForce GT 620M
GPU and Process  40nm GF116             28nm GF117/40nm GF108    28nm GF117
CUDA Cores       96/144                 96                       96
GPU Clock        675MHz                 800MHz                   625MHz
Shader Clock     1350MHz                1600MHz
Memory Bus       192-bit                128-bit                  128-bit
Memory           Up to 2GB DDR3/GDDR5   Up to 2GB DDR3           Up to 1GB DDR3
|
# ? May 24, 2012 15:56 |
|
Yeah, I'm just going to have to wait until some benchmarks are out when I can compare them side by side.
|
# ? May 24, 2012 21:03 |
|
Gwaihir posted:The 620m is at least a 28nm part with "up to" 28.8 GB/sec, so there is that.
|
# ? May 25, 2012 01:47 |
|
How does a GT520 fare against a SNB HD3000 or whatever it is? I'm especially curious how it fares under Linux at 1920x1080.
|
# ? May 25, 2012 02:11 |
|
Alereon posted:Integrated graphics has at least 21.3GB/sec though, 25.6GB/sec on current-gen hardware. GF117 has only 4 ROPs, so rendering at above ~720p is right out anyway.
|
# ? May 25, 2012 03:54 |
|
Happy_Misanthrope posted:Don't you mean "has at most"? Isn't that the total bandwidth available to the CPU - meaning the CPU is going to be taking a decent chunk of that?
|
# ? May 25, 2012 05:22 |
|
And there be the ULV and dual-core Ivies. Not gonna lie, it's a little intimidating to see the top-end i7 basically running dead even with my laptop's i5-2410M at half the power consumption. And that's not even counting the HD 4000 graphics.
|
# ? May 31, 2012 08:29 |
|
Cicero posted:I couldn't find any reviews, but I would guess that the 620M is only a marginal improvement over the HD4000. Alereon posted:Zero to negative. Nothing below a GT 635M will be competitive with integrated graphics, they only exist as a workaround for driver compatibility issues and to con people into paying more. The key limiter of performance on integrated graphics is memory bandwidth, and if a dedicated graphics card doesn't have significantly more memory bandwidth than system memory it isn't going to be any better. quote:Naturally, if you want a better gaming experience, there are plenty of options to choose from. Are they as small and sleek as an Ultrabook? Generally speaking, no. The closest we’ve come to seriously thin gaming laptops might be the Sony VAIO Z, but the Acer TimelineU is definitely moving in that direction. Going forward, we expect to see quite a few Ultrabooks launching with some form of NVIDIA Optimus graphics. We already know about the TimelineU M3 and the ASUS UX32A; the TimelineU is a 14” laptop with typical Acer components (e.g. low quality screen, mediocre build quality), but it has a potent GPU. ASUS is going for a better built Ultrabook with a 1080p IPS LCD on some models (though not on the $799 model), but with a slower GT 620M GPU. The GT 620M is still a step up from the previous generation GT 540M, however, with core/shader clocks of 700/1400MHz, so it should provide for decent gaming. We’ll report more when we have a test unit in hand.
|
# ? May 31, 2012 18:49 |
|
While the GT 620M has much better power usage than the GT 540M, in terms of performance the GT 620M is a definite downgrade from the GT 540M, which is why a rebadged GT 540M is positioned as a GT 630M. The Asus N56VM in the benchmarks on your link is using one of the higher-clocked GT 630M variants, and it either breaks even or slightly leads the HD 4000 in most games, with only Arkham City, Dirt 3, and Skyrim seeing meaningful gains over the integrated graphics. Clock it 100-200MHz slower and it will still be faster in those three games (though probably no longer by enough to make it a meaningfully better experience), but slower in nearly everything else. The bottom line is just that unless you put in a videocard with GDDR5 (and a minimum of 8 ROPs for gaming above 768p), the biggest difference is that you no longer have to deal with Intel drivers.
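The bandwidth numbers being thrown around here all fall out of the same arithmetic: transfers per second times bytes per transfer times channels. A quick sketch (the DDR3 speed grades are the usual ones for these parts, not specs pulled from any particular laptop):

```python
def peak_bandwidth_gb_s(transfers_mt_s, bus_width_bits, channels=1):
    """Theoretical peak: MT/s * bytes per transfer * channels, in GB/s."""
    return transfers_mt_s * (bus_width_bits // 8) * channels / 1000.0

# Dual-channel DDR3-1333 (last-gen integrated graphics): ~21.3 GB/s
print(peak_bandwidth_gb_s(1333, 64, channels=2))  # -> 21.328
# Dual-channel DDR3-1600 (current-gen): 25.6 GB/s
print(peak_bandwidth_gb_s(1600, 64, channels=2))  # -> 25.6
# GT 620M, 128-bit DDR3 at 1800 MT/s: the "up to" 28.8 GB/s figure
print(peak_bandwidth_gb_s(1800, 128))             # -> 28.8
```

Which is exactly why a low-end dedicated card on a 64-bit or 128-bit DDR3 bus can't pull away from integrated graphics: it's drinking from roughly the same size straw.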
|
# ? Jun 1, 2012 03:40 |
|
Sorry if this is the wrong place to ask, but I'm looking to build a new PC soon and was wondering if I could get some advice on whether to go for the Ivy Bridge, or splash on this beast: http://ark.intel.com/products/63696 I'm aware the Ivy Bridge is more power-efficient and I can grab one for about £250, whereas the 3960X is about £680-£750 and seems to be massively more powerful. I'm looking for a powerful machine, so, assuming the answer isn't obvious, which CPU would you choose? Is the Ivy Bridge worth going for?
|
# ? Jun 1, 2012 16:34 |
|
Sandy Bridge-E hexa-core is still the highest-end platform (especially if you need a crazy amount of RAM), but you need to know whether it'll be worth the money. It depends what you do, really, but I doubt you'd be limited by a regular Ivy Bridge chip for a lot less. http://www.anandtech.com/bench/Product/443?vs=551 HalloKitty fucked around with this message at 16:52 on Jun 1, 2012 |
# ? Jun 1, 2012 16:48 |
|
And of course, it only does you any good if you can actually make use of six cores. You'll get the best bang for your buck if you get the cheapest current generation i5, or i7 if you need heavily threaded performance. There are so many better ways to spend the price difference between the mid-line and high-end processors.
|
# ? Jun 1, 2012 17:25 |
|
Definitely go to the system building thread and ask (AFTER READING THE OP). "Powerful" PCs mean different things to different people, and that thread is a great place to start.
|
# ? Jun 1, 2012 17:29 |
|
Zhentar posted:And of course, it only does you any good if you can actually make use of six cores. The big draw is the 8 RAM slots - if you need it, it's one of those things vv In that case you're not the person Sandy Bridge-E is for. Definitely just get yourself a 3570k or a 3770k and a nice Z77 board HalloKitty fucked around with this message at 19:34 on Jun 1, 2012 |
# ? Jun 1, 2012 17:44 |
|
Thanks for the advice people, my needs aren't anything like rendering or super high-end stuff, so it sounds like the ivy bridge is my best bet. In general I'm looking for just speedy performance, so I don't think I can justify a splurge on a ~£800 CPU.
|
# ? Jun 1, 2012 18:54 |
|
Well, this has been a long time in the making. More precisely, six years of research into x86-based GPUs and highly parallel processors. That thing is 50 original Pentium cores with added 16-wide vector and FP64 hardware. And as you might notice, there are no video outputs. Intel is not pretending that's a GPU; it's a supercomputer co-processor designed to compete with Nvidia's Tesla. AnandTech
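Back-of-the-envelope throughput for a design like that is just cores × vector lanes × FLOPs per lane per cycle × clock. A sketch of that arithmetic, with the clock speed purely a placeholder since Intel hadn't published final clocks for the part:

```python
def peak_gflops(cores, vector_lanes, clock_ghz, flops_per_lane_per_cycle=2):
    """Peak throughput, assuming one fused multiply-add (2 FLOPs)
    per vector lane per cycle."""
    return cores * vector_lanes * flops_per_lane_per_cycle * clock_ghz

# 50 cores, 16-wide vectors, hypothetical 1.0 GHz clock
print(peak_gflops(50, 16, 1.0))  # -> 1600.0 GFLOPS single-precision
```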
|
# ? Jun 19, 2012 16:16 |
|
Factory Factory posted:
My favorite detail about Phi is that they are functionally a complete compute node, so you can SSH into them and get a Linux environment.
|
# ? Jun 20, 2012 20:45 |
|
I tried convincing the validation guys to mine Bitcoins as part of their testing. Either they ignored me...or they are rich beyond their wildest dreams.
|
# ? Jun 20, 2012 21:42 |
|
My 4.9 GHz 2500k just died. The MB (P8P67 Pro) and CPU died simultaneously, so I don't know what caused what. I don't think it's the overclocking (I never went over 1.38V for 24/7 use). In any case, Intel didn't ask any questions about that - just gave me an RMA number right away. I wonder if Intel's new "overclocking warranty" has any actual use, since Intel has this don't-ask-don't-tell policy about overclocking anyways.
|
# ? Jun 28, 2012 05:48 |
|
Only if you leave scorch-marks.
|
# ? Jun 28, 2012 15:20 |
|
Any ETA for IB-E?
|
# ? Jun 28, 2012 15:55 |
|
As of March:
|
# ? Jun 28, 2012 20:59 |
|
When are they going to release the Ivy Bridge version of the 6/8/10-core Extreme Edition? I was hoping they'd push toward 8-10 cores for the Extreme Edition once they improved TDP with the 22nm process.
|
# ? Jun 29, 2012 08:38 |
|
|
That's literally the question that was asked and answered in the two posts above yours.
|
# ? Jun 29, 2012 08:45 |