JawnV6
Jul 4, 2004

So hot ...

Alereon posted:

Where are our 8GB DIMMs already?

At least 300ns away, according to the JEDEC spec.


Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

incoherent posted:

Those x79 boards and the memory slots :stare:

http://www.anandtech.com/show/4793/x79-motherboards-from-gigabyte-msi-at-idf-2011
Goddamn, if the SB-E CPUs weren't so loving expensive, I'd upgrade to LGA2011 just for the memory slots. 4x4GB is waaaaayyyyy cheaper than 2x8GB.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Combat Pretzel posted:

Goddamn, if the SB-E CPUs weren't so loving expensive, I'd upgrade to LGA2011 just for the memory slots. 4x4GB is waaaaayyyyy cheaper than 2x8GB.

How much more than an LGA1366 setup will they be? (6x4GB)

Also, I thought they'd make a 'server' LGA1366 board with 12 slots but Newegg doesn't carry them.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Bob Morales posted:

How much more than an LGA1366 setup will they be? (6x4GB)

Also, I thought they'd make a 'server' LGA1366 board with 12 slots but Newegg doesn't carry them.
I don't think it's possible to have more than 2 DIMMs per channel for unbuffered DDR3. The multi-proc server boards use memory buffer chips, for example.
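A back-of-envelope sketch of what that 2-DIMMs-per-channel limit means for total unbuffered capacity (the function name and the 2-per-channel default are just for illustration):

```python
# Max unbuffered capacity = channels x DIMMs-per-channel x DIMM size.
# Assumes the usual 2-DIMM-per-channel electrical limit for unbuffered DDR3.
def max_unbuffered_gb(channels, dimm_gb, dimms_per_channel=2):
    return channels * dimms_per_channel * dimm_gb

print(max_unbuffered_gb(2, 4))  # LGA1155 dual-channel: 16
print(max_unbuffered_gb(3, 4))  # LGA1366 triple-channel: 24 (hence 6 slots, not 12)
print(max_unbuffered_gb(4, 4))  # LGA2011 quad-channel: 32
```

Which is why a 12-slot LGA1366 board never showed up: with only three channels of unbuffered DDR3 you top out at 6 slots, and anything more needs registered DIMMs or buffer chips.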

movax
Aug 30, 2008

JawnV6 posted:

At least 300ns away, according to the JEDEC spec.

What do you mean? There are 8GB unbuffered DIMMs out right now it looks like?

Bob Morales posted:

How much more than an LGA1366 setup will they be? (6x4GB)

6x4GB DDR3 non-ECC/non-registered RAM was dirt cheap for me; I think I paid under $200 from the 'egg.

Bob Morales
Aug 18, 2006



movax posted:

What do you mean? There are 8GB unbuffered DIMMs out right now it looks like?


6x4GB DDR3 non-ECC/non-registered RAM was dirt cheap for me; I think I paid under $200 from the 'egg.

I just maxed out one of my VM boxes a few weeks ago:



6x Kingston KVR1333D3N9/4G (4GB DDR3-1333) @ $23.99 each = $143.94

They're $2 cheaper each today ($21.99).

Alereon
Feb 6, 2004

Anandtech has a pretty thorough article up about the Ivy Bridge architecture.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


God. Haswell looks so much more awesome, good loving god. A 10-20W mainstream CPU that's faster than current gen and even cheaper :aaa:

freeforumuser
Aug 11, 2007
My god, another 4-6% performance per clock over SB? What drugs are Intel on?

Combat Pretzel
Jun 23, 2004

Heh. I seem to remember them throwing a 20% figure around in the past.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Welp I'm fine with my Sandy Bridge now thanks!

At least this is some good news for AMD. poo poo was bad enough when they had to do expectations management to shave their claimed 50% clock-for-clock gain down to 35%; if Intel had managed a 20% clock-for-clock improvement with Ivy Bridge, that would have sounded like a nail going into a coffin for this generation of product. Now, not so much; maybe AMD can get some more efficiency out of Bulldozer.

Edit: That MOV instruction trick is really clever. Intel is kinda cool.

Agreed fucked around with this message at 17:59 on Sep 17, 2011

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!

Lovely chipset table, nice segmenting to squeeze all the dollars from the high-end must-have-it-all buyers. Want SSD caching? Oh, buy the cheapest boards that can't overclock. Oh, you wanted to overclock too? Yeah, no, our middle-of-the-range boards don't support that; can I interest you in this FARTAL1tY ROG XTR3M GTX 997 board for only $359? You can triple-SLI with this puppy, too!

Is there a difference other than marketing (and 3x graphics support) between the Z77 and Z75? And aren't the graphics PCIe lanes controlled by the CPU anyway? So the BIOS checks the installed chipset and sets a flag on the CPU to block 3x SLI, even though they haven't installed any chip that sits in the electrical path between CPU and GPU(s). Effectively this is like the famous SLI tax of the past, where boards wouldn't SLI unless they had a useless SLI-optimizer chip. The 891 triple-SLI users are getting ripped, I'd say.

Zhentar
Sep 28, 2003

Brilliant Master Genius
To be fair, even if the chip supports a feature, there is a non-zero cost to validating the feature (and rejecting failures) and offering support for the feature.


Reading about the power savings and GPU performance makes me feel bad about just buying a sandy bridge laptop. Curse you and your continual advancement, technology!

Agreed
Dec 30, 2003


I know they're advertising an increased maximum multiplier and better power efficiency, but I am really curious what kind of overclocking performance enthusiasts will be able to pull out of this thing. The physical space occupied by individual components in the microarchitecture is getting really tiny, and doesn't the much smaller die mean that the bumps themselves are getting smaller as well? More vulnerable architecture and power delivery, offset by more efficient power and heat characteristics.

Wonder if we'll be seeing these things at the kinds of clocks people are getting with Sandy Bridge. Only time will tell, I guess, but it's pretty amazing how many people are running 4.5GHz on their 2500K/2600K, and some of us are lucky enough to get 'em even higher without getting into unsafe power and heat territory. I don't expect anyone apart from the "stable enough to boot and take a screenshot before the chip cooks" [H] guys to be coming anywhere near topping out its multiplier, but it will be interesting to see if Ivy Bridge's architectural balance in favor of efficiency means that they'll keep up with or even surpass Sandy Bridge's nearly trivial overclocking and put that 6% clock for clock to work for enthusiast setups.

Everything's so small, but I guess a big part of their jobs designing the processor and chipsets is to consider all that and deliver a final product that can keep up with and surpass its predecessor. Still, going to be fun to watch for awhile as people do the trial and error involved with establishing apparently safe parameters!

Agreed fucked around with this message at 01:14 on Sep 18, 2011

freeforumuser
Aug 11, 2007

karoshi posted:

Lovely chipset table, nice segmenting to squeeze all the dollars from the high-end must-have-it-all buyers. Want SSD caching? Oh, buy the cheapest boards that can't overclock. Oh, you wanted to overclock too? Yeah, no, our middle-of-the-range boards don't support that; can I interest you in this FARTAL1tY ROG XTR3M GTX 997 board for only $359? You can triple-SLI with this puppy, too!

Is there a difference other than marketing (and 3x graphics support) between the Z77 and Z75? And aren't the graphics PCIe lanes controlled by the CPU anyway? So the BIOS checks the installed chipset and sets a flag on the CPU to block 3x SLI, even though they haven't installed any chip that sits in the electrical path between CPU and GPU(s). Effectively this is like the famous SLI tax of the past, where boards wouldn't SLI unless they had a useless SLI-optimizer chip. The 891 triple-SLI users are getting ripped, I'd say.

It's funny that the same PC industry that complains PC gaming is too expensive is the one selling overpriced "gaming" hardware.

And speaking of SLI, my friend bought a $250 680i triple-SLI mobo back in the E6600 days with an 8800GTS 320MB, only to end up with a single 9600GT before the whole thing was replaced with a 2500K + 6870. That's $150 extra paid for nothing.

SRQ
Nov 9, 2009

So, at what point will the equivalent new processor to an i5 750 be twice as powerful? As soon as something with double my power is in the $200-300 range, I'll jump all over it.

Alereon
Feb 6, 2004


SRQ posted:

So, at what point will the equivalent new processor to an i5 750 be twice as powerful? As soon as something with double my power is in the $200-300 range, I'll jump all over it.
Unless you use applications that are extremely well-threaded and can take advantage of 12+ cores, I'd say it will probably be 2013 and the Haswell cores at the absolute earliest. Intel is mostly focusing on power- and die area-efficiency right now, as well as graphics performance, and those are going to pay off mostly in the mobile arena. Desktop doesn't really have much to push a CPU right now.
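For a rough feel of why doubling is so far off: if the ~4-6% per-clock gains quoted upthread just compounded generation over generation (a toy model that ignores clock speed and core-count changes entirely), it takes a long while:

```python
# Generations of compounding per-clock (IPC) gains needed to double
# single-thread performance. Toy model: ignores clocks and core counts.
import math

def gens_to_double(gain_per_gen):
    return math.log(2) / math.log(1 + gain_per_gen)

print(gens_to_double(0.06))  # ~12 generations at 6% per gen
print(gens_to_double(0.20))  # ~3.8 generations at 20% per gen
```

So at Ivy Bridge-like per-clock steps, the doubling comes from threading and clocks long before it comes from IPC alone.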

Agreed
Dec 30, 2003


Alereon posted:

Desktop doesn't really have much to push a CPU right now.

That's true, I don't know of much on the desktop that requires the kind of horsepower we already have and have had for years. Still, the next year should be interesting for the enthusiast desktop user. Even though Intel is making smart moves, AMD is finally positioned to do something that might actually attract competitive business for desktop users (as well as hopefully get their prices right for server use, god knows they need it).

You probably caught Bulldozer's world-record 8+GHz with some absurd custom cooling system, as usual for that kind of showy silliness, but the message is clear. AMD is pushing for big numbers, and attached to that record's press release was the calculated off-hand remark about how easy it was for them to achieve 5GHz+ with air cooling. That stands out, given that Sandy Bridge K-SKU users mostly run at around 4.0GHz to 4.5GHz, with some exceeding that but few getting within a multiplier or two of 5GHz. If AMD's not being dishonest here (and I make no judgment there; their high clock was verified and it'd be a pretty lovely time to lie), they may be able to leverage more realistic big-number overclocks and really compete on performance (not to mention that we still don't really know how Bulldozer's semi-cores are going to affect real-world performance). If they can make it affordable, anyway. Intel seems to have the advantage at every turn, but it's something.

Their expectation management regarding clock for clock improvement doesn't exactly impress, but if Bulldozer really does turn out to be an extraordinary overclocker as they're trying to demonstrate and they can keep their prices down, it could make the time between now and Haswell much more interesting.

Still, Intel is absolutely flush, and their resources mean they can afford to keep their eye on the future, prepping their technological base for improvements to come. It's hard to beat that much money, and the architectural efficiency of Sandy Bridge and Ivy Bridge is impressive.

I've read that Intel's GPU is architecturally more integrated than AMD's. Anyone make sense of that claim for me? AMD is focusing so heavily on integrated graphics, is it true that Intel has superior integration (and if so, does that mean we can expect them to be able to scale their on-die graphics at a faster pace as well)?

Agreed fucked around with this message at 09:03 on Sep 19, 2011

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!

Agreed posted:

I've read that Intel's GPU is architecturally more integrated than AMD's. Anyone make sense of that claim for me? AMD is focusing so heavily on integrated graphics, is it true that Intel has superior integration (and if so, does that mean we can expect them to be able to scale their on-die graphics at a faster pace as well)?

Intel's GPU is (better) integrated with their CPU cache architecture. This is very important because the available memory bandwidth is less than even a low-end discrete GPU's (see the AMD E-350's GPU performance scaling linearly with DDR3 frequency, 1066->1333). This means Intel's solution is more efficient in bandwidth-starved situations: the higher the resolution, the smaller the difference between Llano and SB. Still, Llano is trouncing SB.

Ivy will add a dedicated L3 cache for the GPU, while it can still use the LLC on the die.

See http://forum.beyond3d.com/showthread.php?t=60372&page=3
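The bandwidth-starvation point is easy to see from peak DDR3 numbers (a rough sketch: 8 bytes per transfer on a 64-bit channel; real sustained bandwidth is lower):

```python
# Peak DDR3 bandwidth in GB/s: MT/s x 8 bytes per 64-bit channel x channels.
def ddr3_peak_gbs(mt_s, channels=1):
    return mt_s * 8 * channels / 1000

print(ddr3_peak_gbs(1066))     # E-350 single-channel: ~8.5 GB/s
print(ddr3_peak_gbs(1333))     # ~10.7 GB/s, the ~25% step the E-350's GPU tracks
print(ddr3_peak_gbs(1333, 2))  # Sandy Bridge dual-channel: ~21.3 GB/s
```

Even the dual-channel figure is shared between CPU cores and the IGP, which is why feeding the GPU out of the on-die cache instead of DRAM matters so much.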

movax
Aug 30, 2008

Agreed posted:

I've read that Intel's GPU is architecturally more integrated than AMD's. Anyone make sense of that claim for me? AMD is focusing so heavily on integrated graphics, is it true that Intel has superior integration (and if so, does that mean we can expect them to be able to scale their on-die graphics at a faster pace as well)?

AMD has more than two decades of GPU R&D to leverage from their ATI acquisition and their GPUs and drivers (god I can't believe I'm saying that for loving ATI) are arguably "better" than Intel's, which is understandable when you own the entirety of ATI's IP.

However, as karoshi said, Intel's GPU is highly integrated with the CPU, and with cache sharing and other techniques it makes up for a relative lack of bandwidth. Remember that even a full-size video card like the GTX 460 only has 8GB/s of bandwidth to the CPU (x16 PCIe 2.0; performance is essentially identical at x8, which is just 4GB/s).

Agreed
Dec 30, 2003


Thanks, folks. Interesting times.

Bob Morales
Aug 18, 2006



I looked up the i740 because I was curious if Intel ever released another graphics card:

Wiki posted:

In August 1999, after less than 18 months on the market, Intel withdrew the i740 from the market. In September Lockheed announced a "customer-focused organizational realignment" that shed many of its divisions, and then closed Real3D on 1 October 1999 (following Calcomp in late 1998). Intel purchased the company's intellectual property, part of a series of ongoing lawsuits, but laid off the remaining skeleton staff. Some staff were picked up as contractors within Intel, while a majority were hired by ATI and moved to a new office.

Nice.

Agreed
Dec 30, 2003


Wonder how many of those guys survived the merger on the ATI side and how many are still around on the Intel side. It's a big small world if some of the guys who used to work together on graphics are now competing directly on graphics. Although it wouldn't surprise me under any circumstances, really; specialized industries are pretty incestuous, and no noncompete is going to last 13 years.

Bob Morales
Aug 18, 2006



Agreed posted:

Wonder how many of those guys survived the merger on the ATI side and how many are still around on the Intel side. A big small world if some of the guys who used to work together on graphics are now competing directly on graphics. Although it wouldn't surprise me under any circumstances, really, specialized industries are pretty incestuous and no noncompete is going to last 13 years.

Did ex-SGI guys start 3Dfx and then they all ended up going to NVIDIA?

You figure a bunch of the guys on the software side would go to places like Microsoft and game studios, then the hardware guys would get picked up to work on mobile devices and consoles.

JawnV6
Jul 4, 2004


Agreed posted:

Although it wouldn't surprise me under any circumstances, really, specialized industries are pretty incestuous and no noncompete is going to last 13 years.

I know one guy who's worked for Cadence 4 times. He's never been hired there, but he's worked for 4 different companies that were bought by Cadence.

movax
Aug 30, 2008

JawnV6 posted:

I know one guy who's worked for Cadence 4 times. He's never been hired there, but he's worked for 4 different companies that were bought by Cadence.

Cadence pretty much plays Corporate Spore IRL.

Anand's chipset table confused me. Docs I've seen report that Panther Point boards will run Sandy Bridge chips, BIOS support pending of course. Yet his table only listed 'IVB' for the x7x chipsets. Error, or did they say differently at IDF? As far as I know, DMI hasn't changed...

freeforumuser
Aug 11, 2007
http://en.wikipedia.org/wiki/Haswell_(microarchitecture)

Most interesting bit about Haswell is the integrated southbridge. Not surprising, given the trend of integrating more and more circuitry onto the CPU package itself.

This surely must make motherboard makers quake in their boots as they see Intel slowly cutting them out of the equation. Intel could simply sell ready-made mobos with Haswell SoCs themselves, bypassing Asus & co completely.

Maybe that's Intel's devious plan all along.

movax
Aug 30, 2008

freeforumuser posted:

http://en.wikipedia.org/wiki/Haswell_(microarchitecture)

Most interesting bit about Haswell is the integrated southbridge. Not surprising, given the trend of integrating more and more circuitry onto the CPU package itself.

This surely must make motherboard makers quake in their boots as they see Intel slowly cutting them out of the equation. Intel could simply sell ready-made mobos with Haswell SoCs themselves, bypassing Asus & co completely.

Maybe that's Intel's devious plan all along.

Nah, the motherboard makers will still be in the equation, and they won't care: fewer chips and easier PCB routing. PCI is dead, and PCI Express is much easier and much more tolerant to route. You could have the lanes of a PCIe link coming from an X58 mismatched in length by nearly an inch and there would be no ill effects.

Lower cost for mobile will be neat too: no PCH + CPU, just one chip to purchase. Desktop I imagine will still retain CPU + PCH over DMI (which will probably move to 8GT/s, mirroring PCIe 3.0).

Henrik Zetterberg
Dec 7, 2007

freeforumuser posted:

http://en.wikipedia.org/wiki/Haswell_(microarchitecture)

Most interesting bit about Haswell is the integrated southbridge. Not surprising, given the trend of integrating more and more circuitry onto the CPU package itself.

This surely must make motherboard makers quake in their boots as they see Intel slowly cutting them out of the equation. Intel could simply sell ready-made mobos with Haswell SoCs themselves, bypassing Asus & co completely.

Maybe that's Intel's devious plan all along.

The less stuff on the board, the cheaper it is for Asus to manufacture.

I like this move. It gives board makers less to screw up by putting in poo poo PCB routing that barely meets Intel specs.

Peechka
Nov 10, 2005
So how about this new LGA2011 socket Intel is shipping in mid-November? No one is really talking about it, but it seems like this new socket is what Ivy Bridge should have been: the new performance leader by leaps and bounds.

quote:

So, we have, most likely, two six-core chips and one four-core model at the entry level for the initial launch.

The expected top end model, the six-core Core i7 3960X with 15MB of L3 cache, was in most of the demo systems seen at the show. Even though its clock frequency is expected to be just a notch lower than the current top end four-core 3.4GHz Sandy Bridge Core i7 2600K, or the six-core current incumbent, the Westmere-based 3.46GHz Core i7 990X, the performance is expected to be between 20 and 40 per cent higher than the Core i7 990X in most benchmarks, or 30 to 50 per cent higher than the Core i7 2600K.

So even with two cores off, the new processor delivers enough extra oomph to give upgraders a reason to open their wallets, while at the same time keeping a safe distance from the expected AMD Bulldozer offerings performance-wise.

http://www.theinquirer.net/inquirer/feature/2109948/intel-x79-chipset-socket-2011-ready-desktop

So the flagship of the new socket, the i7 3960X, seems to be an eight-core processor, but at the show they disabled 2 cores.

Further reading...

quote:

In summary, aside from rumored chipset I/O issues, Intel's Socket 2011 platform seems ready to roll. In fact, we'd recommend it not just to the highest end desktop users but even to those users thinking of the current four-core Core i7 2600K. For a bit more money, you could take the entry level four-core Socket 2011 model. Yes, it will lose a bit on clock speed, but you'll get four memory channels instead of two, a larger cache, plus 40 PCIe V3 lanes for full dual GPU operation and individual x16 full bandwidth slots, without having to wait for Ivy Bridge in 2012. The extra eight PCIe V3 lanes left after covering two GPUs can be nicely used for a very fast direct attach PCIe SSD drive card, then.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
That further reading sounds like it's coming from a :rice: driver, honestly. Especially since those 20% to 50% figures are really, really suspect given the results Tom's Hardware got when actually running benchmarks and not just pooping numbers into their hands to smear onto the page. Plus quad-channel memory and 40 PCIe 3.0 lanes and extra CPU cache are about as useful for the average desktop user as a 1.5 kW power supply, i.e. not, and underutilized to boot.

TwoKnives
Dec 25, 2004

Horrible, horrible shoes!
Can anyone explain a little bit more about the integrated southbridge? Is it being done for performance reasons? Why now? Someone mentioned that there is a trend for integrating more circuitry into the CPU, which we've seen with CPU + GPU combos, but what's next in line potentially? Where does it end?

movax
Aug 30, 2008

Peechka posted:

So how about this new LGA2011 socket Intel is shipping in mid-November? No one is really talking about it, but it seems like this new socket is what Ivy Bridge should have been: the new performance leader by leaps and bounds.


http://www.theinquirer.net/inquirer/feature/2109948/intel-x79-chipset-socket-2011-ready-desktop

So the flagship of the new socket, the i7 3960X, seems to be an eight-core processor, but at the show they disabled 2 cores.

Further reading...

For your average desktop user...hell, even for your power desktop user, LGA2011 brings nothing to us except sucking more money out of our wallets. Here is a quick list:

1. Hexa-cores are cool, but games don't care. Real multithreaded applications and workflows will benefit, but those people are probably already running twin quad-core HT Xeons at a much lower price point.
2. PCIe 3.0 is useless right now. There are no PCIe 3.0 GPUs out, my buddies have nothing but complaints and bitching about their qualification testing for it (Xilinx and Intel in particular).
2a. [H]ardOCP showed a very minor (if any) loss in performance after kneecapping PCIe bandwidth, proving that the bottleneck was still the GPU itself. PCIe 3.0 will benefit us when PCIe 3.0 GPUs come out in the SLI and CF arena, because it delivers more bandwidth with fewer lanes, meaning we may not need bridge chips like the NF200 anymore. A x8 PCIe 3.0 link would deliver ~7.9GB/s; a x16 2.0 link currently delivers 8.0GB/s, and even that isn't stressed fully.
3. More memory channels, cool story bro, I guess. Obscene amounts of RAM are nice, but I would choose an SSD as the first upgrade before throwing in gobs of RAM.

So it's really cool if you're rich as hell and have money to spare, but it certainly doesn't obsolete our existing Sandy Bridge CPUs. If you have something that absolutely loves threads, congratulations, you're now a huge percentage faster, but the same can be achieved this very instant by purchasing a DP Xeon system from Dell or HP.
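Those per-direction figures fall straight out of lane count x transfer rate x encoding efficiency (a sketch; PCIe 2.0 uses 8b/10b encoding at 5GT/s, PCIe 3.0 uses 128b/130b at 8GT/s):

```python
# PCIe bandwidth per direction: lanes x GT/s x encoding efficiency / 8 bits.
def pcie_gbs(lanes, gt_s, efficiency):
    return lanes * gt_s * efficiency / 8

print(pcie_gbs(16, 5.0, 0.8))      # x16 PCIe 2.0: 8.0 GB/s
print(pcie_gbs(8, 5.0, 0.8))       # x8 PCIe 2.0: 4.0 GB/s
print(pcie_gbs(8, 8.0, 128/130))   # x8 PCIe 3.0: ~7.9 GB/s, nearly a full x16 2.0 link
```

That near-equivalence of x8 3.0 to x16 2.0 is exactly why bridge chips like the NF200 stop being necessary for multi-GPU once 3.0 parts arrive.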

TwoKnives posted:

Can anyone explain a little bit more about the integrated southbridge? Is it being done for performance reasons? Why now? Someone mentioned that there is a trend for integrating more circuitry into the CPU, which we've seen with CPU + GPU combos, but what's next in line potentially? Where does it end?

An integrated southbridge simplifies things for all parties involved (well, excepting Intel's initial R&D and engineering to create the unified package in the first place). It's one less BGA to place on the board, one less BGA to X-ray for inspection, and one less BGA to suffer from tin whiskers. It also simplifies board layout, because you don't need to run traces connecting the southbridge to the CPU. Granted, DMI is not that bad to route, but perhaps not having to route those 18 traces means they can drop a layer from the PCB and cut cost. The breakout region around the single chip may get a little hairy, but the BGA itself will likely grow to accommodate it.

That's also one less heatsink to purchase, because it's all in one package now. It would also benefit from moving to a smaller process, and draw less power as a result. For all we know, the interconnect on Haswell between "southbridge" and CPU may not be DMI anymore, but some kind of ultra high-speed link that only has to travel a few mm as opposed to a few inches.

Combat Pretzel
Jun 23, 2004

Is a 30-50% guesstimate really worth 3-4x the price?

freeforumuser
Aug 11, 2007

Combat Pretzel posted:

Is a 30-50% guesstimate really worth 3-4x the price?

If gaming is the most intensive stuff you'll do on your PC, LGA2011 is stupid as far as price/performance is concerned. 6 cores do nothing over 4 cores for games now, and won't for at least the next 2-3 years.

Combat Pretzel
Jun 23, 2004

Regardless, why are they comparing the 3960X, instead of the 3820, to the 2600K?

Factory Factory
Mar 19, 2010


Combat Pretzel posted:

Regardless, why are they comparing the 3960X, instead of the 3820, to the 2600K?

Because the 3960X replaces the 990X, which the 2600K already competes with most of the time. And it's not crazy to see either 1) the effect of two more cores/extra cache vs. a 4-core chip with substantially similar architecture or 2) how the new platform's flagship performer compares to the current platform's flagship performer.

It's a preview. Full review with context and responsibility forthcoming.

Factory Factory fucked around with this message at 15:53 on Sep 23, 2011

Mr. Crow
May 22, 2008

Snap City mayor for life
The question I care about with regard to the 3960X is whether it will drive prices down on anything else.

movax
Aug 30, 2008

Mr. Crow posted:

The question I care about with regard to the 3960X is whether it will drive prices down on anything else.

Doubt it. They're selling plenty of SKUs at the current price points; they have zero reason to decrease prices. They'll just increase their profit margin as yields go up. Good time to have Intel stock.

I mean what else are OEMs and enthusiasts going to use, AMD? :downsrim:


Factory Factory
Mar 19, 2010


movax posted:

Doubt it. They're selling plenty of SKUs at the current price points; they have zero reason to decrease prices. They'll just increase their profit margin as yields go up. Good time to have Intel stock.

I mean what else are OEMs and enthusiasts going to use, AMD? :downsrim:

Hey, you never know, Bulldozer could completely change the way we think about being a complacent, slightly depressing runner up! :smuggo:

Seriously, though, I hope it pans out well for AMD.
