Josh Lyman
May 24, 2009


Phone posted:

Eh, I think that the main points for Ivy Bridge were to get their 22nm process working, practical tri-gate transistors, and to lower the overall thermal profile. I think that Ivy Bridge is going to be the "look ma, it works!" in the grand scheme of things, but it's still going to be able to hold its own.
As a laptop-only user, Ivy Bridge is meant for me, not you. I've been on a Core2Duo/GeForce 8400 machine for 4 1/2 years now and Ivy Bridge can't come soon enough. :negative:

edit: Since my post, I'm now seriously considering abandoning the laptop-only model and going 2500K/HD 6850 + dual core Ivy Bridge ultrabook. That'll be just under $2000, which is about my upper limit for a decked out Ivy Bridge laptop. Ugh, decisions.

Josh Lyman fucked around with this message at 15:25 on May 13, 2012


Josh Lyman
May 24, 2009


For anyone who's interested, I just put together a 3570K system and it plays Starcraft 2 at 1920x1080 on high with no problems. Really impressed with the HD 4000.

Josh Lyman
May 24, 2009


Alereon posted:

DRAM, not SRAM, it's a 128MB DDR3 die connected to the CPU over a 512-bit bus via a silicon interposer (a slice of silicon that both the CPU die and DRAM die are bonded to, that is then bonded to the substrate). It would be way too expensive to fit 128MB of SRAM to a CPU.
So the CPU packaging will have a DRAM chip sitting next to the actual CPU core?
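For a sense of why a 512-bit on-package bus is worth the interposer trouble, here's a rough back-of-envelope bandwidth calculation. The bus width comes from the quoted post; the transfer rate is an assumed illustrative figure, not an Intel spec:

```python
# Back-of-envelope peak bandwidth for a wide on-package DRAM link.
# 512-bit bus width is from the quoted post; the transfer rate is
# an assumed round number for illustration only.
def peak_bandwidth_gbps(bus_width_bits: int, transfers_per_sec: float) -> float:
    """Peak bandwidth in GB/s = bytes per transfer * transfer rate."""
    return bus_width_bits / 8 * transfers_per_sec / 1e9

# At an assumed 1.6 GT/s, a 512-bit bus moves 64 bytes per transfer:
print(peak_bandwidth_gbps(512, 1.6e9))  # -> 102.4 GB/s
```

Even at modest clocks, a bus that wide dwarfs what a 64-bit DDR3 channel can deliver, which is the whole point of putting the DRAM on the package.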

Josh Lyman
May 24, 2009


flavor posted:

There is no situation in the English language where "off of" is correct, and using it is a sure sign of the author not being a professional writer, which puts the research in question, at least a little bit. That's what I meant.

That being said, I've checked out some different sources, and even more than a week back there was talk of that Haswell refresh already, so I guess it's true. My apologies.
Exactly. "Based on" is the phrase. "Based off" is a malapropism.

Josh Lyman
May 24, 2009


Factory Factory posted:

There are no more rumors about the soldering, here's the Word of Intel: Tocks get socks.

Haswell, Skylake, whatever's next - socketed.

Broadwell, Skymont, whatever's after whatever's next - BGA only.

Subject to change, since apparently the tick-tock cadence is getting a hiccup with Broadwell. We're getting Haswell Refresh without a process node switch, apparently due to troubles with 14nm.
Wait what? That would mean every chip switches between sockets and BGA. That's 110% :psyduck:

Josh Lyman
May 24, 2009


SRQ posted:

Apple is trying to get iOS apps to be 64 bit now, could this be related to Intel having better low power offerings?
I don't see how but my brain finds patterns in dumb things, would this be competitive with the A7?
I believe Intel's mobile chips are 32 bit, and besides, it's unlikely Apple will switch away from custom silicon. Remember, an iPhone 5 gives you a full day of usage with less than half the capacity of an AA battery.

Josh Lyman
May 24, 2009


Install Windows posted:

An alkaline AA battery holds 8100 joules. The iPhone 5 battery holds 19,260 joules (the 4s and 4 were 19,000 joules).
I was referencing the iPhone 5 having a 1440 mAh battery and an AA having 2700 mAh.

edit: v Right right, Ah is not a measure of energy.

Josh Lyman fucked around with this message at 03:55 on Sep 12, 2013
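The correction in the edit is the key point: mAh is charge, not energy, so you have to multiply by the nominal cell voltage to compare batteries. A quick sketch, using typical nominal voltages as assumptions (the quoted ~8,100 J figure for an alkaline AA is lower because effective capacity drops under real loads):

```python
# mAh alone isn't energy; multiply by nominal voltage to get
# watt-hours, then by 3600 to get joules. Voltages are typical
# nominal values, assumed for illustration.
def joules(capacity_mah: float, nominal_volts: float) -> float:
    return capacity_mah / 1000 * nominal_volts * 3600

iphone5 = joules(1440, 3.8)   # Li-ion pack, ~3.8 V nominal
aa_cell = joules(2700, 1.5)   # alkaline AA, 1.5 V nominal
print(round(iphone5), round(aa_cell))  # -> 19699 14580
```

The ~19,700 J result lines up with the 19,260 J figure quoted above, which is why the iPhone's 1440 mAh ends up holding far more energy than a 2700 mAh AA.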

Josh Lyman
May 24, 2009


KingEup posted:

Can anyone give me the dimensions of the stock heatsink/fan that ships with the new Haswell CPUs? I actually just need the height of the thing.

I'm trying to figure out how tall it sits off the motherboard and I can't find this info using my google skills. From the photos it looks like about 6cm tall:


You really shouldn't use the stock HSF. Technically it will keep your CPU from burning up, but even a $10 HSF from Newegg will do much, MUCH better.

edit: Oops, this advice is for Ivy Bridge. Don't know if it still applies to Haswell, but it's sound nonetheless.

Josh Lyman
May 24, 2009


Alereon posted:

Broadwell, the 14nm shrink of Haswell, has been delayed for at least one quarter due to yield issues with the 14nm process. Originally it was planned for initial production late this year for a launch in 2014, but now may be pushed back until late 2014 or even 2015. Note that Broadwell is a "tick", meaning it will be offered as a multi-chip module that is soldered onto motherboards, not a socketed processor ("tocks get socks"). The Haswell refresh next year should tide us over until Skylake in 2015.
I'm sure this is a dumb question, but Ivy Bridge was a tick and came in LGA 1155. :confused:

Josh Lyman
May 24, 2009


Factory Factory posted:

It's a new policy with Haswell.
If they plan to only sell a CPU integrated into the motherboard, what are desktop DIYers supposed to do until the next tock comes out?

Josh Lyman
May 24, 2009


Lolcano Eruption posted:

I guess it won't be too bad. We already do this with K processors: Z87 board; non-K: H87 board. We only want the illusion of choice, but we just make the same picks anyway.
I would agree with you, but look at all the awful laptop configurations you see at Best Buy.

My Ivy Bridge setup is a 3570K and ASRock Z77 Extreme4. Assuming ASRock would sell a Z97 or whatever motherboard with an embedded processor, what are the chances a feature-rich board like the Extreme4 would come only with a more expensive CPU like the 3770K?

Also, I got my CPU + mobo for like $270 combined due to an insane bundling deal.

Josh Lyman
May 24, 2009


Agreed posted:

The enthusiast sector could use a break anyway, I'm hoping that engineering toward efficiency kills overclocking dead for good at some point in the near future because there just isn't any headroom to be had on any variation of current processes. It's too much hassle for too little reward, and they provide solutions that are appropriate to various goals at understandable price points. If overclockers get left out in the cold, and I count, I've got a shitload of 200mm case fans and a big, three-fan NH-D14 and all that jazz, fine, gently caress it, I'll come inside where it's warm and stop wasting money to wrestle performance out of parts that only have the overhead sometimes and be happier to buy fully warrantied and validated chips with higher core counts, etc., for future projects that call for more processing power than can be supplied by a stock configuration.
I don't think that's the tradeoff we're facing. The choice seems to be a 3.4GHz part that might run at 4GHz vs. a part that just runs at 3.4GHz.

I will say, however, that my 3570K + 16GB + SSD config doesn't really leave me wanting on the general use front. User experience improvements, as they have been for the last half decade, come almost entirely from the GPU and sweet, sweet 27" monitors.

Josh Lyman fucked around with this message at 10:16 on Oct 19, 2013

Josh Lyman
May 24, 2009


HalloKitty posted:

I disagree to some extent, I think the biggest improvement in experience in the last 5 years has been the wider use of SSDs, by a long, long way.

I'm still running a 6970 (unlocked 6950) and I can't find a good reason to upgrade, even though it's a great time to piss away money on graphics cards, and my main monitor is a U2410, which is probably a four year old model now.
I guess I meant that AFTER the SSD, improvements come from the GPU. Once you switch from HDD to SSD, there isn't much improvement left on that front, but you can always upgrade your video card.

But yeah, my SSD desktop made my HDD laptop unbearable to use.

Josh Lyman
May 24, 2009


I'm catching up on the thread and saw all the G3258 excitement over the summer. Is it basically an HTPC special? I'm running a 3570K so I can't imagine it's much of a replacement for a desktop PC.

Josh Lyman
May 24, 2009


GokieKS posted:

It's obviously not an upgrade to the 3570K (really, nothing is a meaningful upgrade over that at this point), but it's a great option for anything that doesn't require more than 2 cores, especially if you're comfortable with overclocking. HTPC, normal desktop use, even gaming, though recent games that don't work properly on systems that don't support 4 threads (e.g. Dragon Age) have put a bit of a damper on that.
That's what I figured. That Microcenter bundle with the MSI motherboard is now $99 which certainly isn't bad.

Josh Lyman fucked around with this message at 04:57 on Dec 24, 2014

Josh Lyman
May 24, 2009


1gnoirents posted:

I have a box of old wafers in my closet. I wonder how much they were worth when they came off the crayon
I took home two 300mm wafers after an internship at a semiconductor firm. One of them fell off a shelf a decade back, but the other one is still safe.

I think. It's been inside a black plastic container all these years.

Josh Lyman
May 24, 2009


Darkpriest667 posted:

It's a 0.3V difference. It's less than 5 watts for 4 DIMMs. That's not enough of an efficiency gain for the entire consumer market to be switched to a new standard. Hell, I fold 24/7 and even for me it's not enough of an efficiency gain to switch standards. For servers it is a big deal because we're talking about datacenters that have 10,000 DIMMs in them.
Is it conceivable that we'd see DDR4 primarily in servers and DDR3 for consumers? My guess is the memory manufacturers would prefer to manufacture only one or the other, but that doesn't mean they can't manufacture both.

From what I can tell, there won't be a DDR5, which means you'll be able to use your DDR4 from Skylake for years after as the industry tries to figure out a successor.

Josh Lyman
May 24, 2009


BIG HEADLINE posted:

It doesn't stop at the bundle deals - if you go in there with a PC Part Picker printout and let them know you're building a new computer, they'll almost always cut you a break for buying a lot of poo poo from them instead of from Newegg/Amazon. The downside is tax, and the more you get from them, the more likely they are to 'suggest' their product protection plan to you as an incentive to make their particular store more money. The only reason I can think of as to why MC is able to outprice Newegg/etc. is that they still build their own in-house systems with the Powerspec line.
I really don't understand how Microcenter is still in business.

Josh Lyman
May 24, 2009


PC LOAD LETTER posted:

Crap like that is why I dislike LGA's to this day. At least if you bent a pin on a P4 or AMD chip you could just use a mechanical pencil with the lead removed to carefully bend it back in place. Worked most every time.
Was the decision mainly to move pin costs to the motherboard manufacturers?

Josh Lyman
May 24, 2009


Haeleus posted:

Given I have a 2600K at 4.3GHz
Just to be clear, when people say this, they're referring to the CPU being able to run at that multiplier under full load but it's not actually running at that speed all the time, right?

Josh Lyman
May 24, 2009


Thanks guys. I feel much better about only pushing my 3570K to 4.2GHz. :smith:

Josh Lyman
May 24, 2009


Twerk from Home posted:

I'd assume those are cherry-picked benchmarks of extremely specific workloads. It confirms what we expected: Skylake's biggest gains will be in the iGPU and battery life, exactly like Ivy Bridge, Haswell, and Broadwell.
Welp, gonna have my Ivy Bridge 3570K until I die.

Josh Lyman
May 24, 2009


mayodreams posted:

Just doing a simple GIS for video card box art delivers.


Oh man, the first video card I ever bought was an ASUS GeForce 256 when I built my first computer in 2000. I think it might have been the V6600 Deluxe because it came with 3D glasses (that I used once).

Josh Lyman fucked around with this message at 06:51 on Aug 5, 2015

Josh Lyman
May 24, 2009


Khorne posted:

This is kinda disappointing. I've had an i7 3770k (ivy bridge) since it came out in 2012 and I was looking to maybe upgrade in early 2016. It looks like I am going to be sitting on this system (i7 3770k / 16gb of ram / gtx 670) for an eternity. Judging by the road map, probably until 2017 at the earliest and 2018 at the latest. That's nuts. I think the only other system I had for a duration similar to this was an E8400. That thing can still run lots of games coming out today at 60+ fps.
I've had a 3570K Ivy Bridge since May 2012 and I've just kind of resigned myself to the fact that I won't be upgrading my CPU until my PSU is replaced, i.e. every 5 years when the warranty expires.

Upgrading your GPU to a 970 will be a significant performance increase.

Josh Lyman
May 24, 2009


Both AnandTech and PCPer claim that Sandy Bridge owners should upgrade. I don't really see the compelling evidence, according to their own separate benchmarks, unless you're doing stuff like video compression or 3D modeling.

Palladium posted:

It's quite amazing that used 2500Ks still manage to go for ~$150 on eBay. That's how bad (or good) the current CPU landscape is.
More like $110: http://www.ebay.com/sch/i.html?_from=R40&_sacat=0&_nkw=2500k%20processor&LH_Complete=1&LH_Sold=1

Josh Lyman fucked around with this message at 04:11 on Aug 6, 2015

Josh Lyman
May 24, 2009


Beautiful Ninja posted:

From an i3 Sandy Bridge you should get a pretty drat nice performance boost. Moving up to 4 proper cores from 2 is a big boost in stuff like games, which don't get much performance benefit from HT in the first place, plus the significant improvements in IPC and power consumption since then. The 2500K/2600K are still good mostly on the back of their extreme overclocking potential. If I remember right, from Sandy Bridge to Broadwell and Skylake is close to a 40% gain in performance per MHz, but generally at the cost of clock speed, since newer chips don't OC as well. Since you're on an i3 you'd also be getting a core clock boost on top of an IPC boost.
Skylake is 25% better IPC than Sandy Bridge.
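The gap between the quoted ~40% figure and 25% mostly comes down to how the per-generation gains compound. A quick sketch with rough, assumed per-generation IPC improvements (illustrative numbers, not measured benchmarks):

```python
# Compounding assumed per-generation IPC gains from Sandy Bridge
# to Skylake. The percentages are rough illustrative figures,
# not measured numbers.
gains = {
    "Ivy Bridge": 0.05,
    "Haswell":    0.08,
    "Broadwell":  0.05,
    "Skylake":    0.05,
}

cumulative = 1.0
for gen, gain in gains.items():
    cumulative *= 1 + gain

print(f"~{(cumulative - 1) * 100:.0f}% over Sandy Bridge")  # -> ~25%
```

Four single-digit generational bumps only compound to about 25%, which is why per-generation reviews look unimpressive even when the multi-generation total sounds large.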

Josh Lyman
May 24, 2009


Don Lapre posted:

I had seen some people say the smaller dies of the latest chips is also a reason?
Smaller dies shouldn't preclude soldering the heatspreader.

Josh Lyman
May 24, 2009


Panty Saluter posted:

Ah, computers and video games....where a 6 can feel like a 10
To 13 to 29-year-old computer nerds, that cover is a honeypot.

Josh Lyman
May 24, 2009


Don Lapre posted:

Crazy rear end new intel HS/F
Oh hey, it's a HSF from 2004 :v:

Josh Lyman fucked around with this message at 21:39 on Aug 14, 2015

Josh Lyman
May 24, 2009


Considering the Apple Watch has wireless charging, I wonder how long it'll be before the iPhone has it. Maybe the 6S?

Josh Lyman
May 24, 2009


DrDork posted:

Yeah, all the ones with the Iris Pro 6200 should, in theory, have the L4 cache; I meant more compared to the other desktop processors, since now you're talking about a different socket. The laptop versions are also a good bit more expensive: the 5950HQ (2.9GHz base/3.7GHz turbo) lists for $623, compared to the 5775C (3.3/3.7GHz) at $366.
Different packaging is a lot easier to manage than a different die though.

Josh Lyman
May 24, 2009


He probably means vs iPads and other "devices"

Josh Lyman
May 24, 2009


pmchem posted:

mid-Q4 for Kaby Lake desktop chips, so sad :(
That means Q2 2017 for wide availability and sales.

My 3570K + mobo will be 5 years old at that point.

Josh Lyman
May 24, 2009


Rukus posted:

Artificial market segmentation is the reason for the whole VT-d thing on K processors. If you could get a K i7, have VT-d, and overclock the hell out of it then you're basically at some of their Xeons in terms of performance.
VT-x is sufficient for most people running workstation VMs as opposed to servers, right?

Josh Lyman
May 24, 2009


Eletriarnation posted:

Coming up on 12 years, actually. It's an Alienware desktop from mid-2004 with an ASUS Socket 478 motherboard and a CT-479 adapter to be able to run a Pentium M and overclock it with desktop-class cooling. My first real gaming desktop and quite nice in its day, only replaced at the end of 2008 when Nehalem came out.

I'm not concerned about it because it's not my primary system, that's a 2500K, but it still works great so I put Windows 10 on it to see how it would hold up. It's not bad with 4GB of memory, I could use it in a bind if all my other desktops somehow broke. Hardware decoding for some video works on the 4650 that I found for $25 on eBay, but anything that has to use software decoding will run like poo poo on a single-core processor.

The more impressive thing to me is that the original 80GB Seagate 7200.7 SATA drive attached to it still works perfectly with no bad sectors. Dog slow compared to a new drive but I can't complain about longevity.
Classic hoarder behavior.

Josh Lyman
May 24, 2009


I need Kaby Lake to get here already so that I can get a new laptop.

Josh Lyman
May 24, 2009


Tab8715 posted:

I feel like I won't be upgrading Sandy Bridge until Cannonlake.
Yeah as far as my desktop goes, I don't see any reason to upgrade my 3570K until it dies.

Josh Lyman
May 24, 2009


Boris Galerkin posted:

Friend who works for Intel just told me that they are laying off 12,000 jobs, which is apparently 11% of their workforce.

http://www.bloomberg.com/news/articles/2016-04-19/intel-cuts-12-000-jobs-forecast-misses-as-pc-blight-takes-toll

That's uh, a big amount.
The PC is dead.

Josh Lyman
May 24, 2009


XPS 13 w/Skylake gonna be the best ultrabook for even longer.


Josh Lyman
May 24, 2009


WhyteRyce posted:

I remember when Paul Otellini was trying to push some super-duper DVR in the beginning of the age of streaming. BK came in and said gently caress that poo poo and it got unceremoniously dumped

For the streaming thing, didn't Intel completely misjudge how difficult it would be to work with all the content providers?
I remember this complete failure: https://en.m.wikipedia.org/wiki/Intel_Viiv
