Anime Schoolgirl
Nov 28, 2002

HalloKitty posted:

Yay for awful thermal performance without delidding on Skylake!



If it's such a cost issue, as they always say, why was soldering so viable in the Sandy Bridge days, when Sandy Bridge cost comparatively less than the newer CPUs?

If only they would offer a die with no GPU and the IHS soldered, it would no doubt cost less to produce overall, and be better in every way that matters to someone buying a top-end CPU for the desktop.
It's like Intel wants people to break these chips delidding them and buy replacements, since delidding isn't covered under warranty.


AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

HalloKitty posted:

Yay for awful thermal performance without delidding on Skylake!



If it's such a cost issue, as they always say, why was soldering so viable in the Sandy Bridge days, when Sandy Bridge cost comparatively less than the newer CPUs?

If only they would offer a die with no GPU and the IHS soldered, it would no doubt cost less to produce overall, and be better in every way that matters to someone buying a top-end CPU for the desktop.

Ehhh, it would also mean setting up another production line to serve a very small market, a market that they have already captured because AMD can't compete. Production lines are not cheap.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
I've seen some people say the smaller dies of the latest chips are also a reason?

Josh Lyman
May 24, 2009


Don Lapre posted:

I've seen some people say the smaller dies of the latest chips are also a reason?
Smaller dies shouldn't preclude soldering the heatspreader.

Botnit
Jun 12, 2015

Managed to get the first Maximus VIII Gene on Amazon but now stuck without the 6700K.

Almost gave in and got one from the UK at an 80% markup. There's one on Amazon for the same price from Israel, but the seller only indicates on the shipping page that it will ship next month. Basically a scam at that point.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

HalloKitty posted:

If it's such a cost issue, as they always say, why was soldering so viable in the Sandy Bridge days, when Sandy Bridge cost comparatively less than the newer CPUs?

If only they would offer a die with no GPU and the IHS soldered, it would no doubt cost less to produce overall, and be better in every way that matters to someone buying a top-end CPU for the desktop.

Probably because when SB dropped people still at least remembered when AMD produced competitive products.

AVeryLargeRadish posted:

Ehhh, it would also mean setting up another production line to serve a very small market, a market that they have already captured because AMD can't compete. Production lines are not cheap.

I guess that goes back to "how hard is it to simply not include the iGPU" when making the chip. Clearly the addition/subtraction of a few megs of L3 cache isn't a big deal, and which specific iGPU goes into a chip varies quite a bit as well, so there's already some precedent. It does seem silly, though; if you're going to make a "gamer-oriented chip," why waste even $1 on an iGPU that your entire market demographic is going to disable the moment they get it? Then again, they did the same thing with Devil's Canyon, so who knows.

frunksock
Feb 21, 2002

MaxxBot posted:

What's wrong with it? Just curious, they seem to still be significantly above average as far as PC hardware review sites go.
I believe you that it is (above average relative to other reviews). I haven't read any in a long time. One thing I used to like about Anand's reviews was that he did a good job explaining the architecture's features to someone with roughly my level of understanding (basic nerd, but I don't work in silicon). It's something that's rare because it requires that the author actually understands it very thoroughly while also being a good writer.

Ian's review kind of drops a lot of text in a way that makes me think he probably doesn't understand what he's describing as thoroughly as Anand did. It's also just not as well written. There's a lot of pretty awkward prose. The fact that he spent like two pages of the review talking about "IPC" without bothering to actually expand the acronym was annoying (he helpfully directs me to click on a link to a previous review where they measured IPC instead -- I can find the expansion buried there). I spent at least three confused seconds wondering why the gently caress there was a set of benchmarks specific to interprocess communication.

He'll spend a paragraph where it seems like he's trying to explain very basic concepts like I'm five (e.g., caching), which is awkward because I have to mentally translate the babytalk ("oh, he's talking about caching"), but at the same time, in other parts of the review, he breezes past stuff that could stand to be elaborated on / explained more thoroughly.

I haven't reread any of Anand's old reviews, so it may be my memory playing tricks on me, but I also recall Anand doing a better job of pulling it together at the end into a reasonable big-picture view of what it all means and who should care and why. I didn't really follow how he got from the meat of the review to the conclusion that I should upgrade my 2600K (which it doesn't seem like I should actually do). It's probably also true that I'd be less critical of the review if I were happier with the benchmark results!

Chuu
Sep 11, 2004

Grimey Drawer

HalloKitty posted:

If only they would offer a die with no GPU and the IHS soldered, it would no doubt cost less to produce overall, and be better in every way that matters to someone buying a top-end CPU for the desktop.

The market is probably not big enough to justify creating a set of masks and test equipment just for this configuration. It might actually be counter-productive for enthusiasts to remove the GPU, since when not in use it's basically a small on-die heatsink. Xeons allow higher max turbo frequencies if you disable cores because of this thermal effect.

Ffycchi
Jun 4, 2014

Sigh...challenge accepted...shitty photoshop incoming.

frunksock posted:

I believe you that it is (above average relative to other reviews). I haven't read any in a long time. One thing I used to like about Anand's reviews was that he did a good job explaining the architecture's features to someone with roughly my level of understanding (basic nerd, but I don't work in silicon). It's something that's rare because it requires that the author actually understands it very thoroughly while also being a good writer.

Ian's review kind of drops a lot of text in a way that makes me think he probably doesn't understand what he's describing as thoroughly as Anand did. It's also just not as well written. There's a lot of pretty awkward prose. The fact that he spent like two pages of the review talking about "IPC" without bothering to actually expand the acronym was annoying (he helpfully directs me to click on a link to a previous review where they measured IPC instead -- I can find the expansion buried there). I spent at least three confused seconds wondering why the gently caress there was a set of benchmarks specific to interprocess communication.

He'll spend a paragraph where it seems like he's trying to explain very basic concepts like I'm five (e.g., caching), which is awkward because I have to mentally translate the babytalk ("oh, he's talking about caching"), but at the same time, in other parts of the review, he breezes past stuff that could stand to be elaborated on / explained more thoroughly.

I haven't reread any of Anand's old reviews, so it may be my memory playing tricks on me, but I also recall Anand doing a better job of pulling it together at the end into a reasonable big-picture view of what it all means and who should care and why. I didn't really follow how he got from the meat of the review to the conclusion that I should upgrade my 2600K (which it doesn't seem like I should actually do). It's probably also true that I'd be less critical of the review if I were happier with the benchmark results!

No...your mind is not playing tricks....Anand was great even going back to his days at CPU magazine. Since he slowed down/stopped I've spent most of my time sorting through guru3d and overclockers shitposts to find actual relevant information...and here of course...which actually seems to be the best place.

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:55 on Mar 23, 2021

Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?

Generic Monk posted:

does 'portable workstation' mean those 2 inch thick dell business monstrosities

Not necessarily. For example, the HP Elitebook 840 G1 has the same chassis as the HP ZBook 14 'Mobile Workstation' (33.89 x 23.7 x 2.25 cm, ~1.6 kg). The substantive differences seem to be a different LCD panel and an AMD GPU with FirePro firmware instead of Radeon firmware.

Assepoester
Jul 18, 2004
Probation
Can't post for 11 years!
Melman v2
So does the QUAD CHANNEL ability of Skylake show any advantage over plain old DUAL CHANNEL, such that it would give mATX or ATX builds with 4x4GB sticks (seemingly pretty much the only DDR4 memory anyone sells right now) an advantage over, say, an mITX build with 2x8GB?

SpelledBackwards posted:

That reminds me, what was the hot new PC tech magazine that popped up in the late '90s or early '00s and featured Anand and people along the lines of him or John Romero for guest columns? I definitely subscribed to that one for a while.
T3?

Assepoester fucked around with this message at 04:44 on Aug 10, 2015

Anime Schoolgirl
Nov 28, 2002

Skylake non-LGA2011v5 is still dual-channel
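
For a rough sense of what quad channel would buy you anyway: peak bandwidth scales with the number of channels, not the number of sticks, so 4x4GB on a dual-channel board is still dual channel. A back-of-the-envelope sketch, assuming stock DDR4-2133 (the helper name is just made up for illustration):

code:
# Theoretical peak DDR4 bandwidth: channels x bus width x transfer rate.
# Assumes DDR4-2133, Skylake's launch JEDEC speed; real-world numbers are lower.

def peak_bandwidth_gbs(channels, mts, bus_bits=64):
    # transfers/sec * bytes per transfer * channel count, reported in GB/s
    return channels * mts * 1e6 * (bus_bits / 8) / 1e9

print(f"dual channel: {peak_bandwidth_gbs(2, 2133):.1f} GB/s")  # ~34.1 GB/s
print(f"quad channel: {peak_bandwidth_gbs(4, 2133):.1f} GB/s")  # ~68.3 GB/s

Whether anything outside synthetic benchmarks actually saturates dual channel is another question.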

Ffycchi
Jun 4, 2014

Sigh...challenge accepted...shitty photoshop incoming.

literally 3 posts up I commented on it >.>

Publications
Anand is also the author of the book AnandTech Guide to PC Gaming Hardware (ISBN 0-7897-2626-2) and has a regular column in CPU Magazine called Anand's Corner.
Source: https://www.wikiwand.com/en/Anand_Lal_Shimpi

Botnit
Jun 12, 2015

[magazine cover image]
I first thought of this one, mainly because every month it would have Romero columns talking poo poo about everybody, and then they gave her a rebuttal column. It was a very weird feeling being barely a teenager and having enough self-awareness to realize "this should be really professionally inappropriate".

Botnit fucked around with this message at 05:30 on Aug 10, 2015

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me
ACTUAL Game Designer

SpelledBackwards
Jan 7, 2001

I found this image on the Internet, perhaps you've heard of it? It's been around for a while I hear.

Ffycchi posted:

No...your mind is not playing tricks....Anand was great even going back to his days at CPU magazine. Since he slowed down/stopped I've spent most of my time sorting through guru3d and overclockers shitposts to find actual relevant information...and here of course...which actually seems to be the best place.

Ding ding ding, that's the mag, thanks. The name was staring me in the face the whole time: Computer Power User (CPU) Magazine. I was probably wrong about Romero (didn't know about PC Accelerator, ha). Not sure who else I was thinking of, though the Wikipedia article does mention Chris Pirillo.

Edit: Does this mean now Stevie Case is going to make us her bitch?

SpelledBackwards fucked around with this message at 05:33 on Aug 10, 2015

Ragingsheep
Nov 7, 2009

DrDork posted:

I guess that goes back to "how hard is it to simply not include the iGPU" when making the chip. Clearly the addition/subtraction of a few megs of L3 cache isn't a big deal, and which specific iGPU goes into a chip varies quite a bit as well, so there's already some precedent. It does seem silly, though; if you're going to make a "gamer-oriented chip," why waste even $1 on an iGPU that your entire market demographic is going to disable the moment they get it? Then again, they did the same thing with Devil's Canyon, so who knows.

Would it be possible to eventually use the iGPU for something like physics in games (similar to how you can use a second graphics card for PhysX)?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ragingsheep posted:

Would it be possible to eventually use the iGPU for something like physics in games (similar to how you can use a second graphics card for PhysX)?

Sure, but that would involve NVidia sharing their PhysX IP, and I'm pretty sure you can guess how likely that is to happen.

The closest thing to useful I've seen out of the Intel iGPUs is that they're available to some video compression platforms and can encode stuff impressively fast (though apparently almost always at lower quality than what pure CPU encoding of the same video would produce).
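
If anyone wants to play with that, here's a minimal sketch of handing an encode to Quick Sync through ffmpeg. It assumes an ffmpeg build with QSV support and working iGPU drivers, and the filenames are placeholders:

code:
# Offload H.264 encoding to the Intel iGPU via Quick Sync (QSV).
# Assumes ffmpeg was built with QSV support; input/output names are made up.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mp4",
    "-c:v", "h264_qsv",       # Quick Sync H.264 encoder (runs on the iGPU)
    "-global_quality", "25",  # rough quality target; lower means better quality
    "output.mp4",
], check=True)

Fast, but per the above, expect plain CPU x264 to beat it on quality at the same bitrate.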

WhyteRyce
Dec 30, 2001

PC Accelerator was the poo poo if you were a teen and into computers. That poo poo knew how to laser-target its demographic.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

incoherent posted:

Anandtech: "Sandy Bridge, your time is up"

*Graphs show poor performance in gaming vs Sandy Bridge, the single deciding factor to upgrade for most people on 2600K CPUs*

I know the biggest gains come from a video card upgrade, but a 25% increase in CPU performance alongside detectable losses in the graphics benchmarks doesn't add up to the claim that Sandy Bridge should be sunsetted.

AT just went full retard on that one. SB is still so comfortably above the baseline of acceptable CPU gaming performance that telling existing SB users to sink $500+ into a 25% "faster" new platform that barely translates into any real-world advantage is downright idiotic.

Panty Saluter
Jan 17, 2004

Making learning fun!

Botnit posted:

[magazine cover image]
I first thought of this one, mainly because every month it would have Romero columns talking poo poo about everybody, and then they gave her a rebuttal column. It was a very weird feeling being barely a teenager and having enough self-awareness to realize "this should be really professionally inappropriate".

Ah, computers and video games....where a 6 can feel like a 10

Josh Lyman
May 24, 2009


Panty Saluter posted:

Ah, computers and video games....where a 6 can feel like a 10
To 13 to 29-year-old computer nerds, that cover is a honeypot.

Freakazoid_
Jul 5, 2013


Buglord

Josh Lyman posted:

To 13 to 29-year-old computer nerds, that cover is a honeypot.

As sex appeal, or because she wrote a guide for Daikatana?

DaNzA
Sep 11, 2001

:D
Grimey Drawer
I'm getting a Daikatana iykwim.


Also seems like it's better to spend money on something like a 980Ti now if you want to just increase your FPS in games.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

Ragingsheep posted:

Would it be possible to eventually use the iGPU for something like physics in games (similar to how you can use a second graphics card for PhysX)?

DirectX 12 allows dissimilar GPUs to be used together. One of the examples they give is using a discrete GPU to render and then your slower iGPU to do the post-processing while the discrete GPU renders the next frame. They got something like an extra 10% FPS, but it added nearly 2x the latency between frames.
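
The fps/latency split falls straight out of the pipelining. A toy model with invented stage times (none of these numbers are from the actual demo, purely illustrative):

code:
# Toy model of DX12 explicit multi-adapter: the dGPU renders frame N while
# the iGPU post-processes frame N-1. All stage times below are assumptions.

RENDER_MS = 10.0     # dGPU render time per frame (assumed)
POST_DGPU_MS = 1.0   # post-processing done on the dGPU itself (assumed)
POST_IGPU_MS = 9.0   # same pass on the much slower iGPU (assumed)

# Single GPU: render and post-process back to back on the dGPU.
single_frame = RENDER_MS + POST_DGPU_MS

# Pipelined: a new frame starts every max(stage) ms, but each frame still
# passes through both stages before it can be displayed.
piped_interval = max(RENDER_MS, POST_IGPU_MS)
piped_latency = RENDER_MS + POST_IGPU_MS

print(f"single GPU: {1000 / single_frame:.0f} fps, {single_frame:.0f} ms latency")
print(f"pipelined:  {1000 / piped_interval:.0f} fps, {piped_latency:.0f} ms latency")
# ~91 fps / 11 ms vs 100 fps / 19 ms: ~10% more fps, near-double latency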

Nam Taf
Jun 25, 2005

I am Fat Man, hear me roar!

So I'm still on an i7 920. It's beginning to get a little long in the tooth, so I'm wondering what to do. Is it worth actually looking towards Skylake and DDR4, or is it just better to go with a 2nd-hand Haswell-era chip and be done with it? I have 18GB total of triple-channel RAM, half 10666 and half 12800, so I don't think I can really re-use that since it seems everyone went back to dual channel?

I was sort of waiting on Skylake to drop to see how it's panning out, but by the looks of it, it may just be easier to cycle in a 2nd-hand Haswell than actually pay the latest-gen premium. I don't really see a benefit to early-adopting DDR4 because the performance seems too close to call, and the socket is just going to change so I can't recycle the mobo in a few years anyway.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Nam Taf posted:

So I'm still on an i7 920. It's beginning to get a little long in the tooth, so I'm wondering what to do. Is it worth actually looking towards Skylake and DDR4, or is it just better to go with a 2nd-hand Haswell-era chip and be done with it? I have 18GB total of triple-channel RAM, half 10666 and half 12800, so I don't think I can really re-use that since it seems everyone went back to dual channel?

I was sort of waiting on Skylake to drop to see how it's panning out, but by the looks of it, it may just be easier to cycle in a 2nd-hand Haswell than actually pay the latest-gen premium. I don't really see a benefit to early-adopting DDR4 because the performance seems too close to call, and the socket is just going to change so I can't recycle the mobo in a few years anyway.

What's driving your upgrade? Do you need more CPU power and have a highly threaded workload? If so, you want a 5820K and X99. Do you want a more modern chipset, onboard stuff, and peripherals? Either H97 or Skylake should meet your needs, unless you're imminently expecting to buy a PCI-E SSD, in which case you want Skylake.

If you just want single-threaded CPU speed, then a 4790K is the way to go. If you've waited this long and don't mind waiting more, the non-overclocking Skylake chips look like they'll be more competitive overall. Not faster than a 4790K / 6700K, but with way better thermals and power consumption, and the price premium on the platform should shrink when H170 launches.

Lolcano Eruption
Oct 29, 2007
Volcano of LOL.

DrDork posted:

I guess that goes back to "how hard is it to simply not include the iGPU" when making the chip. Clearly the addition/subtraction of a few megs of L3 cache isn't a big deal, and which specific iGPU goes into a chip varies quite a bit as well, so there's already some precedent. It does seem silly, though; if you're going to make a "gamer-oriented chip," why waste even $1 on an iGPU that your entire market demographic is going to disable the moment they get it? Then again, they did the same thing with Devil's Canyon, so who knows.

For the desktop market, Intel really only fabs three dies: dual-core mobile, quad-core mobile, and 8-core enterprise. The desktop market just receives the "waste" of these production lines. That could mean some faulty cache, a die that's too leaky, or some non-operational cores.

The worst dual-core mobile bins become the Celerons, Pentiums, and i3s. The worst quad-core mobile bins become i5s and i7s. Lastly, the worst-binned server chips become the HEDT chips.

Therefore, in the end, it doesn't cost Intel anything to include the iGPU on the mainstream desktop chips, because it's already there as a result of those chips originally being fabbed for mobile.

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:55 on Mar 23, 2021

Swartz
Jul 28, 2005

by FactsAreUseless
I'm debating a Skylake upgrade. I'm on a 2500K @ 4.2GHz. It has served me quite well, and I know that switching to Skylake would be expensive due to needing a new motherboard and RAM as well.

What I'm most interested in, and haven't seen much of, is benchmarks in single-threaded games. The reason I bring this up is that the main game I play on my PC is Stalker: Call of Pripyat, because I'm always actively modding it, and although it technically uses 2 cores, it's mostly 1 core.

If Skylake doesn't improve much on Sandy Bridge's single-threaded performance, I'm thinking I should at least wait until Skylake Refresh, when hopefully higher clocks and/or 6-core processors are out.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Swartz posted:

I'm debating a Skylake upgrade. I'm on a 2500K @ 4.2GHz. It has served me quite well, and I know that switching to Skylake would be expensive due to needing a new motherboard and RAM as well.

What I'm most interested in, and haven't seen much of, is benchmarks in single-threaded games. The reason I bring this up is that the main game I play on my PC is Stalker: Call of Pripyat, because I'm always actively modding it, and although it technically uses 2 cores, it's mostly 1 core.

If Skylake doesn't improve much on Sandy Bridge's single-threaded performance, I'm thinking I should at least wait until Skylake Refresh, when hopefully higher clocks and/or 6-core processors are out.

How badly do you want that single-threaded performance? If you splash out for a 6700K, it should be about 15-25% faster single-threaded than a 2500K @ 4.2. It's certainly not a huge leap.
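
Roughly where that range comes from: the clocks are a wash at 4.2GHz vs the 6700K's 4.2GHz single-core turbo, so nearly all of the gap is IPC. The 20% cumulative Sandy-to-Skylake uplift below is an assumed ballpark; it compounds over four generations and varies a lot by workload:

code:
# Rough single-thread estimate: overclocked 2500K vs stock 6700K.
# The cumulative IPC figure is an assumed ballpark, not a measurement.

sb_clock = 4.2    # overclocked 2500K, GHz
sky_clock = 4.2   # 6700K max single-core turbo, GHz
ipc_gain = 1.20   # assumed Sandy Bridge -> Skylake cumulative IPC uplift

speedup = (sky_clock / sb_clock) * ipc_gain
print(f"estimated single-thread speedup: {(speedup - 1) * 100:.0f}%")  # ~20%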

Swartz
Jul 28, 2005

by FactsAreUseless

Twerk from Home posted:

How badly do you want that single-threaded performance? If you splash out for a 6700K, it should be about 15-25% faster single-threaded than a 2500K @ 4.2. It's certainly not a huge leap.

It's not imperative, but it would be very nice. The main issue with Stalker is that it's very CPU intensive, and since it uses just one core it tends to have lots of stuttering (and it can't be an I/O issue as I'm on an SSD, and it's not the GPU as I have a GTX 970).

25% would be nice; if it had a minimum of that I'd upgrade. Otherwise I think I'll wait until Skylake Refresh or maybe even whatever is after that, though I was hoping to build my new PC in mid-2016.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Swartz posted:

It's not imperative, but it would be very nice. The main issue with Stalker is that it's very CPU intensive, and since it uses just one core it tends to have lots of stuttering (and it can't be an I/O issue as I'm on an SSD, and it's not the GPU as I have a GTX 970).

25% would be nice; if it had a minimum of that I'd upgrade. Otherwise I think I'll wait until Skylake Refresh or maybe even whatever is after that, though I was hoping to build my new PC in mid-2016.

There's another dark horse option: the i7-5775C. TechReport found it to outperform the 6700K on gaming workloads: http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/14

The issue is that how it performs varies greatly by game. If having 128MB of cache on package helps STALKER, the 5775C will deliver a much better experience than a 6700K. If it doesn't, then it's slower. It's one of those things that has to be tested to see.

NarDmw
Mar 23, 2008
Fun Shoe

Nam Taf posted:

So I'm still on an i7 920. It's beginning to get a little long in the tooth, so I'm wondering what to do. Is it worth actually looking towards Skylake and DDR4, or is it just better to go with a 2nd-hand Haswell-era chip and be done with it? I have 18GB total of triple-channel RAM, half 10666 and half 12800, so I don't think I can really re-use that since it seems everyone went back to dual channel?

I was sort of waiting on Skylake to drop to see how it's panning out, but by the looks of it, it may just be easier to cycle in a 2nd-hand Haswell than actually pay the latest-gen premium. I don't really see a benefit to early-adopting DDR4 because the performance seems too close to call, and the socket is just going to change so I can't recycle the mobo in a few years anyway.

I am in a similar situation with an i7 930, and found out that buying a used hexacore Xeon X5670 (any Xeon 5600-series chip, really) for $75 can work. I can keep my motherboard and RAM and have a Sandy Bridge-equivalent chip (with an overclock) for a cheap upgrade to hold me over a little longer. It's worth looking into. Does anyone else have experience upgrading a Nehalem Bloomfield to a Xeon Westmere-EP?

Ak Gara
Jul 29, 2005

That's just the way he rolls.

Swartz posted:

It's not imperative, but it would be very nice. The main issue with Stalker is that it's very CPU intensive, and since it uses just one core it tends to have lots of stuttering (and it can't be an I/O issue as I'm on an SSD, and it's not the GPU as I have a GTX 970).

25% would be nice; if it had a minimum of that I'd upgrade. Otherwise I think I'll wait until Skylake Refresh or maybe even whatever is after that, though I was hoping to build my new PC in mid-2016.

It might be cheaper to buy an H110i GT and try to push your 2500K. It's possible to get faster single-core speeds than even a 4790K.

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

NarDmw posted:

I am in a similar situation with an i7 930, and found out that buying a used hexacore Xeon X5670 (any Xeon 5600-series chip, really) for $75 can work. I can keep my motherboard and RAM and have a Sandy Bridge-equivalent chip (with an overclock) for a cheap upgrade to hold me over a little longer. It's worth looking into. Does anyone else have experience upgrading a Nehalem Bloomfield to a Xeon Westmere-EP?

Yeah, I did exactly what you did - tossed an old 920 and bought an X5670 Westmere Xeon, then threw a 212 Evo cooler on it and bumped it to 3.55GHz. It's still down a bit on per-core performance compared to my 2500K or 3570K machines, but I just wanted 12 threads to play with, and I only have a couple hundred bucks in the mobo/proc/HSF.

The Westmere Xeons are on a 32nm process rather than the 45nm of the 920, so power draw isn't much more than the quad-core's.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Twerk from Home posted:

There's another dark horse option: the i7-5775C. TechReport found it to outperform the 6700K on gaming workloads: http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/14

The issue is that how it performs varies greatly by game. If having 128MB of cache on package helps STALKER, the 5775C will deliver a much better experience than a 6700K. If it doesn't, then it's slower. It's one of those things that has to be tested to see.

Availability is an issue, yes, but the 5775C is criminally underrated. It's the only chip I'd recommend a Z97 board for, since it has amazing gaming performance at such low clocks and power draw.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?

Palladium posted:

Availability is an issue, yes, but the 5775C is criminally underrated. It's the only chip I'd recommend a Z97 board for, since it has amazing gaming performance at such low clocks and power draw.

It's also more expensive than a 6700K by MSRP, which will likely hold true if availability is an issue. Also, most Z97 motherboards will require a BIOS update before they can take an i7-5775C. So that's another pain in the rear end; I don't think they can update their BIOS without an already-compatible chip.


NarDmw
Mar 23, 2008
Fun Shoe

JnnyThndrs posted:

Yeah, I did exactly what you did - tossed an old 920 and bought a X5670 Westmere Xeon, then threw a 212 Evo cooler on it and bumped it to 3.55ghz. It's still down a bit on per-core performance compared to my 2500k or 3570k machines, but I just wanted 12 threads to play with and only have a couple hundred bucks in the mobo/proc/HSF.

The Westmere Xeons are on 32nm fab rather than the 45nm process of the 920, so power draw isn't much more than the quad-core.

Did you find that to be a useful/good value upgrade that allowed you to game at 1080p?
