Alucardd
Aug 1, 2006


Factory Factory posted:

By the by, anyone concerned about competition and innovation in light of AMD's restructuring might take some solace if you read up on Joseph Schumpeter's theories of monopoly and innovation. In a nutshell, he said that monopolies can drive innovation because they have large amounts of capital which can be invested into novel research. As semiconductor design is a highly investment-driven industry, it's a perfect place for such a market dynamic.

In fact, you have to remember that the transistor (and hundreds of other things) was made possible by the near-inexhaustible funds that AT&T funneled into Bell Labs.

Alucardd fucked around with this message at 06:11 on Feb 22, 2012


SRQ
Nov 9, 2009



As long as the monopoly wants to, and sees a profit in it, sure they'll shove money into R&D, but if they decide that it isn't worth it, well...

Zhentar
Sep 28, 2003

Brilliant Master Genius


SRQ posted:

If I understand this right, they took the original pentium design and die shrunk it + added new instruction sets?

Assuming you're talking about Claremont/NTV... what's special is that they've managed to dramatically improve the lower bound on idle efficiency. With Sandy Bridge, built on the same process, essentially no matter how low you set the clock speed, you can't drop the voltage below about 0.8V (don't quote me on that exact number). If your processor leaks, say, 1 watt at 0.8V, then the only way to get it to draw less than 1 watt is to turn some or all of it off entirely. Turning part or all of the processor on or off wastes some power in the process and carries a small performance penalty (I think the major transitions usually take single-digit microseconds).

What NTV does is let them keep dropping the voltage further. At 0.28V, your 1 watt of leakage power drops to about 0.12 watts, and you can switch in and out of that state with much less overhead. You can use it to better exploit brief idle periods where the on/off transition takes too long, and you can use it for cheaper wakeups where the processor only needs to turn on briefly for short, low-level operations.
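As a rough sanity check on those numbers (a toy model of my own, not anything from Intel's materials): if leakage current scales roughly linearly with voltage, leakage power goes as V², which happens to reproduce the 1 W → ~0.12 W figure quoted above.

```python
# Toy model (my assumption, not a measured Intel figure): treat leakage
# current as roughly linear in voltage, so leakage power scales as V²
# relative to a measured reference point.
def leakage_power(v, v_ref=0.8, p_ref=1.0):
    """Scale a reference leakage figure (p_ref watts at v_ref volts)
    down to a lower operating voltage v, assuming P ∝ V²."""
    return p_ref * (v / v_ref) ** 2

print(leakage_power(0.28))  # ≈ 0.1225 W, matching the ~0.12 W quoted above
```

Real leakage behavior is messier than a clean square law, but the scaling illustrates why dropping voltage beats gating the core on and off for short idle windows.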

SRQ
Nov 9, 2009



Sounds like intel just made some serious inroads into the tablet market then.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

Keep in mind Claremont is a concept demo and will not be productized. However, something like Silvermont with NTV technology could potentially be competitive with the Krait and Cortex A15 down the road.

Longinus00
Dec 29, 2005
Ur-Quan

Alucardd posted:

In fact, you have to remember that the transistor (and hundreds of other things) was made possible by the near-inexhaustible funds that AT&T funneled into Bell Labs.

So you're saying we'll soon be leasing our CPUs from intel?

freeforumuser
Aug 11, 2007


Alereon posted:

EXPreview has some details and graphics benchmarks of the upcoming Ivy Bridge Core i5 3570K. HD Graphics 4000 is between 30-85% faster than HD 3000 in real games, up to more than twice as fast in 3DMark.

VR-Zone is also reporting that the previously announced delay for Ivy Bridge will only apply to dual-core mobile chips. Desktops and quad-core mobile chips will launch at the beginning of April as planned.

I was like "omg" when I saw IB GPU beats SB by 60% until I realised it's still much slower than my $150 mid-2008 9600GT (>> GT 240 DDR3).

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

freeforumuser posted:

I was like "omg" when I saw IB GPU beats SB by 60% until I realised it's still much slower than my $150 mid-2008 9600GT (>> GT 240 DDR3).

Yeah, I could even forgive that except for the sad reality that the games it was benchmarked with are probably the only games the drivers are compatible with. It's really hard to overstate how terrible Intel's graphics drivers are: not minor bugs, but poo poo like games not working at all and going months and months between driver updates.

Beelzebubba9
Feb 24, 2004


Factory Factory posted:

By the by, anyone concerned about competition and innovation in light of AMD's restructuring might take some solace if you read up on Joseph Schumpeter's theories of monopoly and innovation. In a nutshell, he said that monopolies can drive innovation because they have large amounts of capital which can be invested into novel research. As semiconductor design is a highly investment-driven industry, it's a perfect place for such a market dynamic.

Also, Intel's biggest competition these days is from its own legacy products. Unlike AT&T, Intel makes money by selling widgets, not services, so if they don't make new chips compelling enough to warrant an upgrade, then they don't generate revenue. I suspect that in a vacuum of viable competition we'll see the rate of improvements slow, but not by much. Intel still has to make vastly improved CPUs year over year to fuel growth, so unless Intel's shareholders want to settle for lower revenue and higher margin (and risk getting mauled on the low end by ARM/WOA), I don't see this changing.

Where AMD was useful was that they could keep Intel from solely dictating the direction of the market. Had AMD imploded a decade ago, I strongly suspect we'd all be running IA64 CPUs right now. Could be worse, really.

But here's a question: How many of you have greatly slowed the rates of your own CPU upgrades? I used to get a new computer/CPU every 2-3 years, but my current i7 920 @ 3.6Ghz has provided enough performance that I feel it'll be until Haswell (or its death) before I upgrade. Anyone else finding themselves in a similar boat?

freeforumuser
Aug 11, 2007


Beelzebubba9 posted:

Also, Intel's biggest competition these days is from its own legacy products. Unlike AT&T, Intel makes money by selling widgets, not services, so if they don't make new chips compelling enough to warrant an upgrade, then they don't generate revenue. I suspect that in a vacuum of viable competition we'll see the rate of improvements slow, but not by much. Intel still has to make vastly improved CPUs year over year to fuel growth, so unless Intel's shareholders want to settle for lower revenue and higher margin (and risk getting mauled on the low end by ARM/WOA), I don't see this changing.

Where AMD was useful was that they could keep Intel from solely dictating the direction of the market. Had AMD imploded a decade ago, I strongly suspect we'd all be running IA64 CPUs right now. Could be worse, really.

But here's a question: How many of you have greatly slowed the rates of your own CPU upgrades? I used to get a new computer/CPU every 2-3 years, but my current i7 920 @ 3.6Ghz has provided enough performance that I feel it'll be until Haswell (or its death) before I upgrade. Anyone else finding themselves in a similar boat?

My 2500K is going to last a long time. From a strictly desktop POV, IB is too lackluster to jump ship to.

It's not even just CPUs; AMD is making 7770s that are barely faster than its own 5770 from almost 2.5 years ago. It's as if they knew Nvidia won't try to one-up them in the future (even though the GTX 560 is already waaaay better). Stagnation is here, folks.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

We're certainly in a period where technological capabilities have been improving much faster than software's ability to make use of them. One primary cause of this is how long it took to replace Windows XP, and Microsoft's choice to move very slowly on 64-bit support. There's really no excuse for Windows 7 having a 32-bit version, much less Windows 8, and the fact that such a huge legacy install base of 32-bit systems continues to go into service means nobody is going to make an app that uses more than 2GB of RAM. Traditionally games have also advanced the state of the art, yet any game made today has to be able to run on a console with a Geforce 7 or Radeon X1900 (without unified shader support) and only 512MB of system RAM; there's only so much you can scale up from that for a PC port.
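For anyone wondering where that ~2GB app ceiling comes from, it falls straight out of 32-bit pointer arithmetic (the even user/kernel split is the Windows default; note that /3GB boot options and large-address-aware builds can shift it somewhat):

```python
# Why 32-bit Windows caps a single app around 2GB: a 32-bit pointer can
# address 2**32 bytes of virtual space, and Windows reserves half of
# that for the kernel by default, leaving the rest to the process.
total_address_space = 2**32                    # 4 GiB of virtual addresses
user_space_default = total_address_space // 2  # default user-mode half

print(user_space_default / 2**30)  # 2.0 (GiB available to the app)
```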

Additionally, the heavy use of GPU and fixed-function hardware acceleration on desktops today dramatically reduces CPU demands. That's not such a big deal on the high end, but look how vastly more capable AMD Brazos systems are than Intel Atoms, thanks to an integrated GPU that can handle video decode and playback in hardware. You can easily watch 1080p Youtube videos on a Brazos netbook with plenty of CPU to spare, while anything more than 360p is right out on an Atom (which is why the netbook became obsolete).

Another thing that was underscored for me today is how vastly more efficient processing has become. The best example is JavaScript execution performance in browsers: it used to be that JS executed too slowly to do more than trivial things, but in current versions of Firefox and Chrome the JS engines are fast enough to fulfill the empty promises of Java: cross-platform applications that are as usable as native code. The biggest hurdle right now is consistent latency, and improvements coming in Firefox 13 and 14 should remove the minor hitches and pauses that keep JS from feeling native. We also have amazing tools like Emscripten that can compile C++ to JS and produce code that runs twice as fast as the best hand-ported code, which multiplies with the speed improvements from new JS engines.

Alchenar
Apr 9, 2008

The level of betrayal I felt when Paradox announced their new wallpaper tore something from me that I'll never be able to recover. They tore away my ability to respect anything, and they tore away my ability to feel human.

On the other hand, over the next year Intel can let its marketing budget in the PC arena fall through the floor.

Alucardd
Aug 1, 2006


Beelzebubba9 posted:

Also, Intel's biggest competition these days is from its own legacy products. Unlike AT&T, Intel makes money by selling widgets, not services, so if they don't make new chips compelling enough to warrant an upgrade, then they don't generate revenue. I suspect that in a vacuum of viable competition we'll see the rate of improvements slow, but not by much. Intel still has to make vastly improved CPUs year over year to fuel growth, so unless Intel's shareholders want to settle for lower revenue and higher margin (and risk getting mauled on the low end by ARM/WOA), I don't see this changing.

This is what I was implying. Sorry if I didn't spell it out.

Longinus00 posted:

So you're saying we'll soon be leasing our CPUs from intel?



No, I'm just disappointed at experimental R&D being shoved aside lately, with so little attention and funding.

BEAR GRYLLZ
Jul 30, 2006

I have strong erections for Israel.
Strong, pathetic erections.



freeforumuser posted:

My 2500K is going to last a long time. From a strictly desktop POV, IB is too lackluster to jump ship to.

It's not even just CPUs; AMD is making 7770s that are barely faster than its own 5770 from almost 2.5 years ago. It's as if they knew Nvidia won't try to one-up them in the future (even though the GTX 560 is already waaaay better). Stagnation is here, folks.

To be fair, stagnation in the video card market has more to do with TSMC dropping the ball on the entire 32nm node than with AMD or Nvidia slowing down. We essentially lost an entire generation of video cards when TSMC hosed up, and so we ended up with weird stuff like AMD's 5800 series cards being just as powerful as their 6900 series, with only a slightly older architecture and a few fewer features to show for their age.

tijag
Aug 6, 2002


BEAR GRYLLZ posted:

To be fair, stagnation in the video card market has more to do with TSMC dropping the ball on the entire 32nm node than with AMD or Nvidia slowing down. We essentially lost an entire generation of video cards when TSMC hosed up, and so we ended up with weird stuff like AMD's 5800 series cards being just as powerful as their 6900 series, with only a slightly older architecture and a few fewer features to show for their age.

The 5870 is slightly faster than the 6870, and the 6970 is an upgrade from the 5870. Maybe it's not a big enough upgrade for you, but you can't claim that the 58xx is just as powerful as the 69xx.

In fact, the 6930 [a heavily die-harvested 69xx] that just launched is basically as fast as an HD 5870, only it's $170.

kapinga
Oct 12, 2005

I am not a number

freeforumuser posted:

My 2500K is going to last a long time. From a strictly desktop POV, IB is too lackluster to jump ship to.

It's not even just CPUs; AMD is making 7770s that are barely faster than its own 5770 from almost 2.5 years ago. It's as if they knew Nvidia won't try to one-up them in the future (even though the GTX 560 is already waaaay better). Stagnation is here, folks.

Generational improvements so massive as to warrant upgrading from the immediately preceding generation are relatively rare, and they're usually as much because of a flaw in the older generation as anything else (Pentium 4 to Core 2, I'm looking at you). Much of the PC market upgrades every couple of years or more, so most manufacturers need to make sure there are improvements worth buying every 2 years, not every 6-12 months. AMD would seem to be having trouble with that in some cases. Intel, by pretty much every measure, is not. And IB has some big upsides for the increasingly important mobile market.

GPUs I'm not nearly as familiar with, but ultimately the premise is the same. And citing the (supposedly) competitive GPU market as a source of stagnation sort of undermines the whole argument about an Intel monopoly specifically being a problem: if the GPU market is stagnating, then AMD's presence in the CPU market is irrelevant and it'll stagnate either way.

KillHour
Oct 28, 2007






kapinga posted:

Generational improvements so massive as to warrant upgrading from the immediately preceding generation are relatively rare, and they're usually as much because of a flaw in the older generation as anything else (Pentium 4 to Core 2, I'm looking at you). Much of the PC market upgrades every couple of years or more, so most manufacturers need to make sure there are improvements worth buying every 2 years, not every 6-12 months. AMD would seem to be having trouble with that in some cases. Intel, by pretty much every measure, is not. And IB has some big upsides for the increasingly important mobile market.

GPUs I'm not nearly as familiar with, but ultimately the premise is the same. And citing the (supposedly) competitive GPU market as a source of stagnation sort of undermines the whole argument about an Intel monopoly specifically being a problem: if the GPU market is stagnating, then AMD's presence in the CPU market is irrelevant and it'll stagnate either way.

Oh God, this just reminded me how old my E6300 is. IB can't come soon enough.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

Just to forestall any drama: Ivy Bridge has not been delayed. The Financial Times published an article that claims it was, and some tech sites are running with it, but this is a misunderstanding of the same information we already knew, that dual-core notebook processors were going to release in June. Everything else is still on track for the first week of April.

tijag
Aug 6, 2002


Alereon posted:

Just to forestall any drama: Ivy Bridge has not been delayed. The Financial Times published an article that claims it was, and some tech sites are running with it, but this is a misunderstanding of the same information we already knew, that dual-core notebook processors were going to release in June. Everything else is still on track for the first week of April.

Thank god. I saw this plastered across every forum and blog, and when I read the article, the only things it specifically said were delayed were 'notebook' chips. The 2500K/2600K replacement needs to launch in April.

I'm holding off on upgrading a lot of things for that. I also hope that GK104 launches in that same window, and that it's within a stone's throw of the 7950 but priced more like you'd expect a 330-360mm² chip to be priced, possibly driving the price of the 79xx, 78xx, and 77xx down to more affordable levels.

Doesn't look like that's going to happen, though, what with nvidia complaining about the 28nm yields at TSMC and all of the 'design wins' for mobile looking like they're just rebadged 40nm Fermi parts [no 28nm in any forecast laptop].

I wonder if Apple is going to dump nvidia again, since nvidia once again looks likely to fail to meet the expectations set by Apple. Also, I'm wondering why Apple switched from AMD back to nvidia in the first place. Considering AMD has been executing very well for some time, what is it that nvidia showed/sold Apple on to convince them to switch back?

japtor
Oct 28, 2005
WELL ARNT I JUST MR. LA DE FUCKEN DA. oh yea and i suck cocks too


Apple probably goes with whoever is decent enough at whatever price point... otherwise, Nvidia has their IGP/discrete switching tech; can AMD's stuff work the same way? That's the only technical reason I can think of. Apple is crazy for anything that can save battery life.

tijag
Aug 6, 2002


japtor posted:

Apple probably goes with whoever is decent enough at whatever price point... otherwise, Nvidia has their IGP/discrete switching tech; can AMD's stuff work the same way? That's the only technical reason I can think of. Apple is crazy for anything that can save battery life.

Well, for the last 2+ years AMD's GPUs have been more power/perf efficient, so I can't imagine it was purely on that metric. If Optimus made them switch, I suspect it won't be for long, as it doesn't look like nvidia's going to have a 28nm mobile GPU anytime soon, so by default they'll be at a huge disadvantage.

A mobile 7750 would be a pretty kickass GPU.

Disgustipated
Jul 28, 2003

Black metal ist krieg

tijag posted:

I wonder if Apple is going to dump nvidia again, since nvidia once again looks likely to fail to meet the expectations set by Apple. Also, I'm wondering why Apple switched from AMD back to nvidia in the first place. Considering AMD has been executing very well for some time, what is it that nvidia showed/sold Apple on to convince them to switch back?

What do you mean dump nVidia again? All of Apple's currently shipping computers are using AMD or Intel GPUs.

tijag
Aug 6, 2002


I don't think it was that long ago that Apple used nvidia before they switched to AMD. If nvidia isn't able to provide the promised performance, price, volume, and performance/watt they told Apple they would, then most likely they'll be dropped in favor of AMD again.

movax
Aug 30, 2008



If I remember the datasheets correctly, both AMD and nvidia GPUs are game for graphics switching, and Intel's been on-board with it since Calpella at least. I wouldn't expect Apple to suddenly switch dGPU vendors this year.

tijag
Aug 6, 2002


Xbit is reporting really hard that quad-core i5s have been postponed.

http://www.xbitlabs.com/news/cpu/di...ailability.html

If this is the case, and the 'sweet spot' i7 launches any time in April I will most likely pay the extra $100 and buy that.

*sigh*

KillHour
Oct 28, 2007






Once again, that only applies to notebooks.

quote:

In his first interview to discuss Intel’s business in China, Mr Maloney told the Financial Times that the start of sales of machines equipped with Ivy Bridge – the 22nm processor set to succeed Sandy Bridge in notebooks this year – had been pushed back from April. “I think maybe it’s June now,” he said.

...

An Intel spokesperson said the company’s plans to start shipping Ivy Bridge in the second quarter had not changed.

http://www.xbitlabs.com/news/cpu/di...ailability.html

tijag
Aug 6, 2002


From that article. [which god knows if it's right or not, I don't trust x-bit, or the author very much]

"Mass availability of dual-core Core i5/i7 as well as desktop quad-core Core i5 central processing units will start from June 3, 2012."

Bold mine.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

tijag posted:

From that article. [which god knows if it's right or not, I don't trust x-bit, or the author very much]

"Mass availability of dual-core Core i5/i7 as well as desktop quad-core Core i5 central processing units will start from June 3, 2012."

Bold mine.
Yeah it's wrong, this is all basically a giant circular game of telephone with people repeating and embellishing some off the cuff remark by an Intel China exec who didn't really know what he was talking about. (Maybe I'm being mean to him here, but "I dunno, notebook processors in June or something maybe? Why? Uhh, well the 22nm process is hard maybe?" doesn't really sound like he was TRYING to speak authoritatively.)

tijag
Aug 6, 2002


Alereon posted:

Yeah it's wrong, this is all basically a giant circular game of telephone with people repeating and embellishing some off the cuff remark by an Intel China exec who didn't really know what he was talking about. (Maybe I'm being mean to him here, but "I dunno, notebook processors in June or something maybe? Why? Uhh, well the 22nm process is hard maybe?" doesn't really sound like he was TRYING to speak authoritatively.)

That would be great news for me. I was hoping for early April availability for the next gen i5-2500k.

wdarkk
Oct 26, 2007

Friends: Protected
World: Saved
Crablettes: Eaten


7GHz overclock. I guess all that improved power efficiency means it's less likely to fry itself doing something like that.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



wdarkk posted:

7GHz overclock. I guess all that improved power efficiency means it's less likely to fry itself doing something like that.

Nope, it's at least as likely to fry itself. At about 2/3 the lithography size of Sandy Bridge, power efficiency is way less salient than the lowered operating-voltage safety limits that usually accompany process shrinks. That's operating MASSIVELY outside anything resembling safe parameters and should be considered a suicide run. It's "look how big ours is" from Intel, in (unnecessary but nonetheless understandable) answer to AMD's suicide-run Zambezi world-record overclock.

Relevant bits that let you know that:

the article posted:

raising voltage to 1.889 Volts, using 63x multiplier and 112.11 MHz and using dry ice

That's a solid .5V higher than the safe voltage for Sandy Bridge, a 32nm part. The multiplier is hand-selected as hell; there is no way your average chip is going to come anywhere near 63x. A BCLK of 112.11MHz is deeply unstable, holy poo poo. You can maybe nudge it to 103-105MHz if you're feeling lucky, but tons of stuff in the system relies on a solid BCLK of 100.00 and it's asking for trouble raising it...

But the most telling part is that they used dry ice to cool it. That's a step away from using a supercooled liquid (which was required for the giant Bulldozer number if I recall correctly) and not at all sustainable.

They overclocked it to 7GHz long enough to get into the OS and save a screenshot. That's that. Bet that chip and the motherboard it was mounted to are broken, and who knows how many chips died along the way trying for the "100% overclock!" magic number?
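A quick back-of-envelope check that the quoted BCLK and multiplier actually land at the headline number:

```python
# Core clock = base clock (BCLK) × multiplier, using the figures
# quoted from the article above.
bclk_mhz = 112.11
multiplier = 63
core_clock_ghz = bclk_mhz * multiplier / 1000

print(f"{core_clock_ghz:.3f} GHz")  # 7.063 GHz — just past the 7GHz mark
```

Which is also why the BCLK had to be pushed so far off 100MHz: at a stock 100MHz base clock, a 63x multiplier only gets you 6.3GHz.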

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


wdarkk posted:

7GHz overclock. I guess all that improved power efficiency means it's less likely to fry itself doing something like that.



For reference: http://valid.canardpc.com/records.php

Factory Factory fucked around with this message at 18:45 on Feb 29, 2012

future ghost
Dec 5, 2005

det er noget at leve for

Gun Saliva

Agreed posted:

But the most telling part is that they used dry ice to cool it. That's a step away from using a supercooled liquid (which was required for the giant Bulldozer number if I recall correctly) and not at all sustainable.

They overclocked it to 7GHz long enough to get into the OS and save a screenshot. That's that. Bet that chip and the motherboard it was mounted to are broken, and who knows how many chips died along the way trying for the "100% overclock!" magic number?

7GHz isn't really that impressive overall, since the link above points out that Nehalem chips scaled even higher in earlier benching runs. This run doesn't really prove anything beyond that a hand-binned chip can scale with cooling; it's not a reflection of overclocking potential with standard cooling.


Now what would be interesting is if they drop one under liquid helium, which is what most of the high BD runs were achieved under IIRC. Even LN2 results would be neat to check out.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


Honestly, what would be interesting is if they simply had a high success rate at 5GHz on reasonably quiet air at reasonable voltage, which I imagine is possible, seeing as it's an improvement all round on SB on a smaller process. But it'd still be nice to see.

freeforumuser
Aug 11, 2007


3770K Chinese (leaked?) preview.

http://news.mydrivers.com/1/218/218443_9.htm

At stock clocks it's only 18W less than the 2600K at load, and performance is pretty much indistinguishable from the 2600K in reality.

5GHz OC at 1.27V. Impressive by SB standards, but it might be a bit too high of a voltage for 22nm.
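To put that 1.27V figure in perspective: dynamic CPU power scales roughly as P ∝ V² · f, so even a modest voltage bump compounds with the clock increase. The stock voltage and clock used below are illustrative guesses of mine, not numbers from the preview.

```python
# Rough sketch of relative dynamic power under overclocking, using the
# approximation P ∝ V² · f. Stock values here are assumed for
# illustration (≈1.0 V at 3.5 GHz), not taken from the linked preview.
def power_ratio(v_oc, f_oc, v_stock=1.0, f_stock=3.5):
    """Relative dynamic power of an overclocked state vs. stock."""
    return (v_oc / v_stock) ** 2 * (f_oc / f_stock)

print(round(power_ratio(1.27, 5.0), 2))  # ~2.3× the assumed stock power
```

Leakage grows on top of that, which is why a voltage that's merely "a bit high" for a shrunken process can still be hard on the chip long-term.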

KillHour
Oct 28, 2007






There appears to have been a leak of all the finalized SKUs.

http://hexus.net/tech/news/cpu/3604...vy-bridge-line/

Don't know how accurate it is, though.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

The Intel Xeon E5-series, Sandy Bridge EP for dual-socket servers, is now available and reviewed on Anandtech. I'm not sure if you can drop these into desktop LGA2011 boards, but I would assume you could, which might be a way to get 8 real cores on the desktop if that's something you actually need and don't mind spending more than $1400 on a CPU.

Edit: Holy poo poo these are absolutely amazing server processors, and they eliminate some of the Opteron's most significant platform advantages, such as LR-DIMM support.

Alereon fucked around with this message at 21:05 on Mar 6, 2012

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.




So, everyone's caught up in Ivy Bridge, but no discussion at all of the fact that the Sandy Bridge Xeon E5-2600 series was released today?

Anandtech has a pretty good writeup.

http://www.anandtech.com/show/5553/...dge-for-servers

Pretty drat impressive if you ask me.

Edit: Hell, that's what I get for getting distracted before I hit the reply button.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams


Mmm, please put these in some Dell servers yesterday so I can buy them.

Or at least get started now so when the time comes I can get these for our virtualization push.


bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.




FISHMANPET posted:

Mmm, please put these in some Dell servers yesterday so I can buy them.

Or at least get started now so when the time comes I can get these for our virtualization push.

The first wave of the 12G servers are live now. You can order the R620, R720, and R720xd with them.

Speaking of which, I'm really liking the increased drive flexibility these things have over their predecessors.

  • Reply