WhyteRyce
Dec 30, 2001

The performance of that onboard video encoder is very surprising. I had no intention of ever using it but now it looks like it could be a nice added bonus

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Interesting tidbit for those curious about boot times on EFI: Anandtech reports that their Intel P67 board cut POST times by a quarter versus Intel P55 and X58 boards, from about 29 seconds to about 22 seconds.

Star War Sex Parrot
Oct 2, 2003

Alereon posted:

Interesting tidbit for those curious about boot times on EFI: Anandtech reports that their Intel P67 board cut POST times by a quarter versus Intel P55 and X58 boards, from about 29 seconds to about 22 seconds.
Considering how fast my Apple computers boot, I expect great things from PC adoption of EFI. I know we have ~40 EFI motherboards at work but I've never actually toyed with them to see if they boot faster.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

WhyteRyce posted:

The performance of that onboard video encoder is very surprising. I had no intention of ever using it but now it looks like it could be a nice added bonus
There was a pretty lovely footnote to this: Intel Quick Sync transcode technology only works if the on-die graphics is enabled and in use. This means those of us with P67 boards or discrete graphics cards can't use the video transcoder, which just smacks of a retarded implementation.

WhyteRyce
Dec 30, 2001

Alereon posted:

There was a pretty lovely footnote to this: Intel Quick Sync transcode technology only works if the on-die graphics is enabled and in use. This means those of us with P67 boards or discrete graphics cards can't use the video transcoder, which just smacks of a retarded implementation.

Oh yeah I forgot about that. Never mind then!

talk show ghost
Mar 2, 2006

by Ozma

Alereon posted:

There was a pretty lovely footnote to this: Intel Quick Sync transcode technology only works if the on-die graphics is enabled and in use. This means those of us with P67 boards or discrete graphics cards can't use the video transcoder, which just smacks of a retarded implementation.

Couldn't this be solved by clever motherboard design that keeps both the discrete and integrated GPUs in use but only uses the discrete one to actually display anything?

WhyteRyce
Dec 30, 2001

Wow, 4.4GHz on that lovely low-profile stock heatsink is pretty drat impressive.

Spite
Jul 27, 2001

Small chance of that...

talk show ghost posted:

Couldn't this be solved by clever motherboard design that keeps both the discrete and integrated GPUs in use but only uses the discrete one to actually display anything?

It's not that simple. You'd also have to be aware that there are 2 GPUs, and data would have to flow well from one to the other. There's a host of annoyances. Similar to why no one has a good solution for dynamically switching between an integrated GPU and a discrete one on the fly based on workload - you have to be able to assume the rest of the system is playing nice, which it most definitely is not.

zachol
Feb 13, 2009

Once per turn, you can Tribute 1 WATER monster you control (except this card) to Special Summon 1 WATER monster from your hand. The monster Special Summoned by this effect is destroyed if "Raging Eria" is removed from your side of the field.
So if I don't intend to use the integrated graphics at all (and I'm using a P67/other board with no video), that part of the chip will be doing nothing?

What the heck? That seems really stupid.

big shtick energy
May 27, 2004


Spite posted:

Similar to why no one has a good solution for dynamically switching between an integrated GPU and a discrete one on the fly based on workload - you have to be able to assume the rest of the system is playing nice, which it most definitely is not.

Don't some of the macbooks do exactly this?

Spite
Jul 27, 2001

Small chance of that...

DuckConference posted:

Don't some of the macbooks do exactly this?

Not quite. The most recent MacBook Pros will switch from the integrated Intel part to the discrete NVIDIA chip. System apps that are "aware" will run integrated until a switch occurs. But they have to be coded with the assumption that their graphics context can and will be yanked out from underneath them at any time. Apps that aren't aware will always power up the discrete part, even if what they are doing doesn't need that power. For example, a simple Core Animation app will switch EVERYTHING over to discrete. It's a whitelist, not based on computational need.
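
Roughly, the decision logic looks something like this toy sketch (illustrative only, not Apple's actual API; the framework names and helper are made up):

[code]
# Toy model of whitelist-based GPU switching (not Apple's real implementation).
# The switcher keys off which graphics frameworks an app links against, not how
# heavy the app's actual workload is.
DISCRETE_TRIGGER_FRAMEWORKS = {"CoreAnimation", "OpenGL"}

def gpu_powered_up(linked_frameworks, switch_aware):
    """Which GPU the system powers up for an app under a whitelist scheme."""
    if DISCRETE_TRIGGER_FRAMEWORKS & set(linked_frameworks):
        if not switch_aware:
            # Unaware apps force the discrete part on, no matter how trivial the work.
            return "discrete"
        # "Aware" apps run integrated until an explicit switch occurs, and must
        # handle their graphics context being yanked out from under them.
    return "integrated"

# A simple Core Animation app that draws almost nothing still flips the whole
# system to discrete: it's a whitelist, not a measure of computational need.
print(gpu_powered_up(["CoreAnimation"], switch_aware=False))  # discrete
print(gpu_powered_up(["CoreAnimation"], switch_aware=True))   # integrated
[/code]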

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
Are they really boxing the Extreme cooler with the K chips?

Goodbye V8 :smith:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Spite posted:

It's not that simple. You'd also have to be aware that there are 2 GPUs, and data would have to flow well from one to the other. There's a host of annoyances. Similar to why no one has a good solution for dynamically switching between an integrated GPU and a discrete one on the fly based on workload - you have to be able to assume the rest of the system is playing nice, which it most definitely is not.
nVidia Optimus is pretty seamless on laptops, and LucidLogix supposedly has a similar brand-agnostic solution ready for desktops using the H55 chipset.

R1CH
Apr 7, 2002

The Ron Jeremy of the coding world
Sandy Bridge is the biggest disapointment (sic) of the year, 2 days in. Oh Charlie. I like how the "showstopper bug" is that booting from USB3 isn't stable.

Also OpenGL is slow when running in software emulation.


How sure is the Jan 5th release? Overclockers is still claiming Jan 9th.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Honestly I am kinda disappointed in desktop Sandy Bridge. The absence of VT-d and TXT on the K-series isn't really significant, but it feels kind of insulting to cut features out of a premium SKU. The fact that you only get Intel HD Graphics 3000 on the K-series CPUs that are least likely to use it is also odd. It's also bullshit that we can't use Quick Sync without using the on-die graphics; I don't see what keeps them from running that silicon even if it's not driving a display. Given the Quick Sync limitation, the feature division between chipsets (P67 gets dual graphics and overclocking, H67 gets Intel HD Graphics and Quick Sync) is even more frustrating.

On the plus side, Sandy Bridge appears to have more than delivered for the mobile sector. Sandy Bridge laptops are insanely fast and have great battery life, and Intel HD Graphics 3000 performs competitively with the GeForce GT 325M, which is pretty drat good for integrated graphics. Paired with a GeForce GTX 560M you'd have a pretty drat efficient mobile gaming system.

Alereon fucked around with this message at 18:20 on Jan 3, 2011

Metanaut
Oct 9, 2006

Honey it's tight like that.
College Slice

incoherent posted:

Are they really boxing the Extreme cooler with the K chips?

Goodbye V8 :smith:

So K's come with the boxy cooler, like the one seen here: http://techreport.com/articles.x/20188/4 ?

I've got my eye on a 2500K and was wondering if it's worth buying a fancier cooler. I'm mostly concerned about the noise level, though; most aftermarket coolers seem to be louder than the old stock Intel one at least, and I haven't seen any reviews say anything about this new one.

Srebrenica Surprise
Aug 23, 2008

"L-O-V-E's just another word I never learned to pronounce."
Anandtech says a couple of times that the stock cooler on the K is the low-profile one.

Whimsy
Jan 8, 2001
I was excited until I found out the 2500K can't be overclocked on a board that actually supports the on-board graphics. The only games I play are older games that wouldn't use anything that powerful, and I was really hoping to reduce my build costs by saving on the PSU and graphics card while having headroom to play with in terms of clock speed.

I'm disappointed. I guess I'll wait until March to see how the market changes. I was hoping to build something sooner rather than later, but I'm not in a rush.

MTW
Dec 30, 2004

by angerbot
Intel charging X dollars for the ability to overclock is downright evil. I say that fully aware that by buying a new processor (it's time for me to do so) and board I am supporting this.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I'm glad that Anandtech mentioned it. I want/need the IOMMU for virtualization purposes. So I'm going to avoid the K-series like the plague. i7-2600 it is.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
Are the K chips still going to be price-gouged the first few weeks?

devmd01
Mar 7, 2006

Elektronik
Supersonik

R1CH posted:

Sandy Bridge is the biggest disapointment (sic) of the year, 2 days in. Oh Charlie. I like how the "showstopper bug" is that booting from USB3 isn't stable.


Thanks for that laugh; it's been a busy morning. Linux doesn't work quite right on bleeding-edge hardware? YOU DON'T SAY!?!?!

movax
Aug 30, 2008

Props to Anand; we've had SB CPUs here for a while now, and I never realized *Ks didn't have VT-d. I don't virtualize quite enough for that to drive me away, but like someone said, it's kind of a downer that the premium SKU (at least for now) doesn't have all the bells and whistles. My reasoning (guess) is that in QA testing, that functionality went straight to hell above certain clocks, and rather than have users enable/disable it in BIOS, they just killed it off. (Either that, or it can't co-habitate with the HD 3000 graphics, which I think is unlikely.)

Also, offering up the 2600K and saying "yeah, you could get above 5GHz with this :smug:", and then basically requiring a chipset (P67) that doesn't implement FDI...why make us pay for that HD3000? Maybe they just expect people to wait for the Z68. Maybe it was a chip packaging decision. I wouldn't mind being able to use the integrated GPU to run my 3rd display, it'd leave me PCIe slots free.

HW transcoding and acceleration is nice on paper, as it always is, but having dealt with being on the cutting edge of multimedia tech, I've been taking the approach of just throwing CPU brawn at decoding to avoid headaches. Colorspace conversion errors, incompatibilities between output renderers and combinations of DShow filters, asinine limitations for HW encoders...good thing there's a CPU underneath all of this that can actually earn its keep.

Looks like a home run for the mobile market; AMD is continuing to get owned there. Interestingly enough, I guess Intel is (like they have been) content to leave the "value" market to AMD.

e: spasticColon, yes, I hope you like the feeling of Newegg's capitalist phallus in you! I will probably take it happily in return for a 2600K

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
The only difference between P67 and H67 is whether there are hookups for video hardware, right? So why does the vanilla Asus P8H67 have no display connectors? As far as I can tell, it's just a P8P67 with a shittier audio codec.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Factory Factory posted:

The only difference between P67 and H67 is whether there are hookups for video hardware, right? So why does the vanilla Asus P8H67 have no display connectors? As far as I can tell, it's just a P8P67 with a shittier audio codec.
H67 doesn't support PCI-Express port bifurcation (dividing the x16 slot into two x8 slots) or overclocking. It's probably also less expensive than the P67.
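
For a rough sense of what that split means, here's a quick back-of-the-envelope sketch (assuming PCIe 2.0's 5 GT/s per lane and 8b/10b encoding, i.e. roughly 500 MB/s per lane per direction; numbers only, nothing chipset-specific):

[code]
# Back-of-the-envelope PCIe 2.0 bandwidth: ~500 MB/s per lane per direction
# after 8b/10b encoding.
MB_PER_LANE_PER_DIRECTION = 500

def slot_gbps(lanes):
    """Per-direction bandwidth in GB/s for a slot with this many lanes."""
    return lanes * MB_PER_LANE_PER_DIRECTION / 1000

print(slot_gbps(16))  # a single x16 slot: 8.0 GB/s each way
print(slot_gbps(8))   # each slot after x8/x8 bifurcation: 4.0 GB/s each way
[/code]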

movax
Aug 30, 2008

Alereon posted:

H67 doesn't support PCI-Express port bifurcation (dividing the x16 slot into two x8 slots) or overclocking. It's probably also less expensive than the P67.

It looks like the spec page says it supports some uber quad-GPU Radeon config; maybe they repurposed the FDI link? Though, IIRC, electrically FDI is very similar to DisplayPort, so maybe it is just the cost issue.

movax
Aug 30, 2008

Also, Asus saw fit to make me dig through their product pages to figure poo poo out (their product comparator was borked when I tried it), and then I saw Legit Reviews put up a nice spec table: http://www.legitreviews.com/article/1500/1/

Happy:
- all have USB 3.0
- all have Firewire
- all have SATA 6Gbps
- P8P67 PRO and above implement a PHY for the integrated MAC in the P67
- P8P67 is only $160.00...so, $200 at the egg?

Sad:
- need to get the PRO or better for SLI
- P8P67 doesn't use the Intel ethernet controller
- P8P67 is starved for lanes :(
- no legacy 775 holes :(

Also, don't buy the LE. Personally I think I will go for the PRO; I don't SLI, but I want the Intel ethernet solution.

Happy to see that pretty EFI implementation. I guess the Asus engineers that were sleeping all the time during training could learn in their sleep.

e: In case anyone was curious about the integrated Intel Ethernet functionality, it's similar to what was in the Q57. The chipset provides an integrated 10/100/1000 MAC. This MAC is useless without an accompanying PHY, however, which as its name suggests interfaces with the physical ethernet network. Apparently it's cheaper for some makers to buy a Realtek controller (MAC+PHY) and use that than to buy just the Intel PHY. If you don't use the Intel PHY, however, I think you can repurpose that PCIe x1 link for something else...like the aforementioned Realtek chip.
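
If you ever want to check which vendor's controller your board actually ended up with, something like this works on Linux; it just reads the PCI vendor ID out of sysfs (rough sketch only, and the vendor table is the standard PCI-SIG assignment):

[code]
# Rough sketch: report the PCI vendor behind each network interface (Linux only).
import glob

VENDORS = {"0x8086": "Intel", "0x10ec": "Realtek", "0x11ab": "Marvell"}

for path in glob.glob("/sys/class/net/*/device/vendor"):
    iface = path.split("/")[4]          # /sys/class/net/<iface>/device/vendor
    with open(path) as f:
        vendor_id = f.read().strip()
    print(f"{iface}: {VENDORS.get(vendor_id, vendor_id)}")
[/code]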

e2: And now I know why the PEX8608 is backordered, thanks Asus! :argh:

I think everyone should seriously consider the "splurge" for getting the Intel solution; your torrents will thank you.

movax fucked around with this message at 19:26 on Jan 3, 2011

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

movax posted:

I think everyone should seriously consider the "splurge" for getting the Intel solution; your torrents will thank you.

If we don't torrent, should we care?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Even if you do torrent, you're almost certainly not going to care about your NIC chipset. GigE has been common for long enough that even your standard Realtek or Marvell chipset has no problems making it work well.

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered

movax posted:

Props to Anand; we've had SB CPUs here for a while now, and I never realized *Ks didn't have VT-d. I don't virtualize quite enough for that to drive me away, but like someone said, it's kind of a downer that the premium SKU (at least for now) doesn't have all the bells and whistles. My reasoning (guess) is that in QA testing, that functionality went straight to hell above certain clocks, and rather than have users enable/disable it in BIOS, they just killed it off. (Either that, or it can't co-habitate with the HD 3000 graphics, which I think is unlikely.)



Yeah right, it's product differentiation.

movax
Aug 30, 2008

Alereon posted:

Even if you do torrent, you're almost certainly not going to care about your NIC chipset. GigE has been common for long enough that even your standard Realtek or Marvell chipset has no problems making it work well.

I know that Broadcom and Atheros aren't up to snuff compared to Intel's offering, and my experiences with Realtek haven't given me the most favorable opinion. Granted, this can be due to the implementation of the controller IC (or drivers!), but Dell had R610s available with Broadcom NICs that would randomly kill off Solaris.

quote:

Yeah right, it's product differentiation.
It's the priciest SKU though? Or maybe HP/Dell buying boatloads of 2600s versus 2600Ks is better for them given manufacturing yields?

JawnV6
Jul 4, 2004

So hot ...

Combat Pretzel posted:

Trying to figure out what VMCS is, I ran across VirtualBox documentation that suggests it's a feature available on all Intel CPUs with VT-x.

The Virtual Machine Control Structure (VMCS) is explained in the Intel PRM Vol. 3B; if that's not quite enough to put you to sleep, you can read the rest of the PRM. It's a 4KB region of memory containing guest state, host state, control bits, etc., accessed through the vmread/vmwrite instructions. Basically it's an implementation detail of VT-x, and unless you're rolling your own hypervisor you shouldn't care about it.
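
If you're curious how those fields are organized, here's a rough sketch of how a VMCS field encoding breaks down per the layout in the PRM/SDM appendix, with GUEST_RIP (0x681E) as the example; sketch only, double-check the manual before relying on it:

[code]
# Rough decoder for a VMCS field encoding (illustrative sketch; see the PRM/SDM).
# Bits 14:13 give the field width, bits 11:10 the area (control / read-only data /
# guest state / host state), bits 9:1 the index, bit 0 the access type.
WIDTHS = {0: "16-bit", 1: "64-bit", 2: "32-bit", 3: "natural-width"}
TYPES = {0: "control", 1: "read-only data", 2: "guest state", 3: "host state"}

def decode_vmcs_field(encoding):
    return {
        "width": WIDTHS[(encoding >> 13) & 0x3],
        "type": TYPES[(encoding >> 10) & 0x3],
        "index": (encoding >> 1) & 0x1FF,
        "high_access": bool(encoding & 0x1),
    }

# GUEST_RIP is encoded as 0x681E: a natural-width guest-state field.
print(decode_vmcs_field(0x681E))
[/code]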

If any of the other SNB vets in FM want to meet up for lunch or at least to awkwardly stare at each other's shoes, hit me up on PM.

Raptop
Sep 3, 2004
not queer for western digital

JawnV6 posted:


If any of the other SNB vets in FM want to meet up for lunch or at least to awkwardly stare at each other's shoes, hit me up on PM.

My spirit will be at hotdog island.

movax
Aug 30, 2008

JawnV6 posted:

The Virtual Machine Control Structure (VMCS) is explained in the Intel PRM Vol. 3B; if that's not quite enough to put you to sleep, you can read the rest of the PRM. It's a 4KB region of memory containing guest state, host state, control bits, etc., accessed through the vmread/vmwrite instructions. Basically it's an implementation detail of VT-x, and unless you're rolling your own hypervisor you shouldn't care about it.

God my PRMs are so old. Ordering some shiny new ones today.

I assume that you could possibly answer this: when will the Intel product pages for their 6 Series Chipsets go up? After/during CES?

redeyes
Sep 14, 2002

by Fluffdaddy
I absolutely love Intel. I buy their stuff constantly, BUT I would never trust their driver team to deliver stable and feature-packed graphics drivers for this new chip. As of right now Intel has never been able to deliver a competitive graphics chip/driver. According to a lot of reviews this SB can't output FILM (23.976fps) content correctly and instead outputs 24fps, which makes movie watching jittery. That is loving unacceptable. I mean, all this hubbub over what amounts to a lovely entry-level graphics chip that lacks features and who knows what else? No loving thanks.

Also, Marvell Yukon PCIe GigE NICs loving rock. You can get a nice Rosewill one from Newegg for 25ish bucks.

WhyteRyce
Dec 30, 2001

Raptop posted:

My spirit will be at hotdog island.

Hotdog Island is gone man :cry:

JawnV6 posted:


If any of the other SNB vets in FM want to meet up for lunch or at least to awkwardly stare at each other's shoes, hit me up on PM.

Does CPT count? We crash your celebrations anyway.

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered

movax posted:


It's the priciest SKU though? Or maybe HP/Dell buying boatloads of 2600s versus 2600Ks is better for them given manufacturing yields?

It creates a separate product for the gamer/enthusiast market that enterprise can't use. No doubt they don't want to offer the K models at all but are worried that completely cutting off overclocking will give AMD an advantage, so they have pigeonholed it with the K models- removing virtualization and playing these ridiculous chipset games.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

redeyes posted:

I absolutely love Intel. I buy their stuff constantly, BUT I would never trust their driver team to deliver stable and feature-packed graphics drivers for this new chip. As of right now Intel has never been able to deliver a competitive graphics chip/driver. According to a lot of reviews this SB can't output FILM (23.976fps) content correctly and instead outputs 24fps, which makes movie watching jittery. That is loving unacceptable. I mean, all this hubbub over what amounts to a lovely entry-level graphics chip that lacks features and who knows what else? No loving thanks.
The 23.976fps thing seems like much ado over nothing. I watch my 23.976fps movies on a 60Hz LCD, which tells you how many fucks I give about judder. I can't imagine being sensitive enough that a doubled frame every 40 seconds bothered me. I mean clearly some people care so it would have been nice for them to support it, but I can't muster even mild annoyance.
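
For the curious, the "every 40 seconds" figure is just the gap between 24.000Hz output and 23.976fps content (rough arithmetic sketch):

[code]
# Where the "doubled frame every ~40 seconds" figure comes from when 23.976fps
# material is played out at a flat 24.000Hz.
content_fps = 24000 / 1001            # "23.976" is really 24000/1001
output_hz = 24.0

surplus = output_hz - content_fps     # ~0.024 extra refreshes per second
print(1 / surplus)                    # ~41.7 s between repeated frames
[/code]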

movax
Aug 30, 2008

redeyes posted:

According to a lot of reviews this SB can't output FILM (23.976fps) content correctly and instead outputs 24fps, which makes movie watching jittery. That is loving unacceptable.

1. It's a good thing there's a CPU included with SB that can decode video purely in software :smug:
2. It's been nearly half a century and everyone still has to deal with framerate fuckery (and they still get it wrong). Though screaming about 23.976 vs 24.000 is a new one to me.

I wonder if the DisplayPort jitter issue that was present on Ibex Peak snuck its way into the 6 Series from all the design reuse?

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
So given that the 2500K costs $216 from Intel, how much will it likely sell for on a site like, say, Newegg?
