DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

HalloKitty posted:

If it's such a cost issue as they always say, why was it so viable in the Sandy Bridge days, when Sandy Bridge cost less comparatively than the newer CPUs?

If only they would offer a die with no GPU and the IHS soldered, it would no doubt cost less to produce overall, and be better in every way that matters to someone buying a top-end CPU for the desktop.

Probably because when SB dropped, people still at least remembered when AMD produced competitive products.

AVeryLargeRadish posted:

Ehhh, it would also mean setting up another production line to serve a very small market, a market that they have already captured because AMD can't compete. Production lines are not cheap.

I guess that goes back to "how hard is it to simply not include the iGPU" when making the chip. Clearly the addition/subtraction of a few megs of L3 cache isn't a big deal, and the specific iGPU that goes into chips varies quite a bit as well, so there's already some precedent. It does seem silly, though; if you're going to make a "gamer-oriented chip," why waste even $1 on an iGPU that your entire market demographic is going to disable the moment they get it? Then again, they did the same thing with Devil's Canyon, so who knows.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ragingsheep posted:

Would it be possible to eventually use the iGPU for something like physics in games (similar to how you can use a second graphics card for PhysX).

Sure, but that would involve NVidia sharing their PhysX IP, and I'm pretty sure you can guess how likely that is to happen.

Closest thing to useful I've seen out of the Intel iGPUs is that they're available to some video compression platforms and can encode stuff impressively fast (though apparently almost always at lower quality than a pure-CPU encode of the same video would produce).
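To make that concrete, here's a minimal sketch of the sort of thing I mean -- purely illustrative, and it assumes an ffmpeg build with Quick Sync support compiled in (the h264_qsv encoder); encoder availability and flags vary by build:

```python
import subprocess

def qsv_encode(src: str, dst: str, bitrate: str = "6M") -> None:
    """Hardware-encode a clip on the Intel iGPU via ffmpeg's Quick Sync path.

    Assumes ffmpeg was built with QSV support (the h264_qsv encoder). It's
    typically much faster than a pure-CPU libx264 encode, usually at somewhat
    lower quality for the same bitrate.
    """
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", src,
            "-c:v", "h264_qsv",   # Quick Sync H.264 encoder (runs on the iGPU)
            "-b:v", bitrate,
            "-c:a", "copy",       # leave the audio stream untouched
            dst,
        ],
        check=True,
    )

# e.g. qsv_encode("input.mkv", "output_qsv.mp4")
```

The speedup is real; it's just that for archival-quality encodes most people still let the CPU grind it out.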

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Twerk from Home posted:

If you're after the highest minimum FPS, you should be looking for an i7-5775c: http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/14


Or take the hundreds of dollars you'd save not buying Skylake or a $500 5775c and buy a better/second GPU.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
I will be "passing down" my 2500k to my "wife"* once "Intel" puts out a "CPU" worth "bothering with"**

But in all seriousness I'm real curious as to how the i5-5675C will turn out price-wise. With the 5775C actually beating the 6700k in gaming, if the i5 actually shows up at <$300 or so like its MSRP suggests, that'd seem like the optimum price:performance point for gamers. 128MB L4 cache for all my friends!

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Combat Pretzel posted:

Are there definitive benchmarks showing that it makes a difference?

TechReport's review has the 5775C in the mix. It generally beats out the 6700k at anything gaming related by small margins, but is ~20% slower in the non-gaming CPU tasks (compression, mostly). That's at stock clocks, of course, and it doesn't look like Skylake is gonna do much in the way of overclocking.

Mind you, it's not a night-and-day boost to gaming, so there's probably a lot to be said for the other benefits of the Skylake ecosystem in comparison. Still, sad that the 6700k loses out to a similarly-priced 65W chip from a few months back.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Toast Museum posted:

What's so bad about micro USB?

It's so tantalizingly close to being reversible. But it's not. So you end up trying to jam the damned thing in the wrong way because it almost fits, and end up damaging it.

I mean, not that I have ever done that, because I'm a careful and exacting person. But I have a wife and, uh, yeah. Thankfully so far it's only damaged the cables, and not the phone(s).

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

dpbjinc posted:

USB cables always plug in with the USB logo facing upward. That's literally part of the standard. It makes life so much easier knowing that.

You're welcome to explain that to said wife at 11pm when she's already turned the lights off and is trying to fumble the plug into her phone on the nightstand. Also, you're assuming standards-compliant cables instead of whatever $5 cable is in the discount bin at BestBuy/Walmart/etc.

I'm not saying it's a terrible standard. I'm just saying that Apple's reversible connector is about 1000% better for 90% of the population, and USB C should be, as well.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

VostokProgram posted:

that's what the USB port said

"It'll go in, trust me, you just gotta push"*

*actual quote

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Botnit posted:

How long do you think it'll be before we start getting 3.1 external HDD's?

What would be the point? Unless you're talking external SSDs, USB 3.0 / Gen 1 is already way faster than the HDD attached to it. I could see wanting USB C for the reversible connector, but I wonder if they could just attach that to an existing USB 3.0 chipset.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

cisco privilege posted:

My motorola phone is thin as hell and apparently supports wireless charging. Without any kind of wireless quick charge though it's easier just to plug it into the quickcharge2.0 adapter for 15 minutes every couple days.

This is a big limiting factor for Qi right now: the power profiles for many of the charging stations are really, really low. Which means either they charge slowly, or quite frequently they don't charge at all because they're "too far away." I'm sorry, lovely Qi charger, but if the 3mm of my slim-profile rubber case is "too far" then you should probably look into sucking less.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Josh Lyman posted:

Considering the Apple Watch has wireless charging, I wonder how long it'll be before the iPhone has it. Maybe the 6S?

Considering the charging loop costs all of like 25c, the answer is probably some mix of "when they can get it working reliably through an aluminum back" and "when Qi pays them enough."

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Apparently they're available in some places in Europe/Asia, and there are import stores that will happily sell them to you. For a fee, of course.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

LiquidRain posted:

Live in Tokyo. Went to Akihabara. 3rd store we tried had it in stock for 50,000Y post-tax. ($400 USD - PC parts come with ridiculous markups here) Guy reached into the stock shelf and grabbed one!

They also have the i7-5775C in stock here if that's your fancy, at about 55,000Y.

That's actually not much of a markup at all. Newegg has them listed for $360, which is about $385 after tax.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Krailor posted:

That will probably replace the Thin Mini-ITX standard and a 65W TDP conveniently lines up perfectly with the Iris Pro Broadwells.

Yeah, considering that it drops some of the nice features from mITX (full-sized DIMMs, PCIe), and the only advantage it has over a NUC is the support for a socketed CPU, it seems pretty clearly aimed at the reality that there are people who like the SFF of the NUC but need a bit more oomph than a 15W CPU can provide. Which is literally all this does--gives Intel a way to let people stuff a 65W CPU into a SFF box without having to bother releasing a new lineup of soldered parts.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Skandranon posted:

While AMD is not doing great these days, ARM based CPUs have been nipping at Intel's heels for awhile, so hopefully if AMD goes under, someone will be able to keep Intel from raising their prices too high.

Too bad that someone is China.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rastor posted:

Chromebooks iPads are rapidly becoming [a secondary device to dick around on while most real work still gets done on] the standard school computer.

Fixed for accuracy and more relevance to ARM.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

VulgarandStupid posted:

That's not true at all. I think you over estimate the market share that gaming enthusiasts hold. There will always be enterprise purchases rolling in as new businesses pop up as well as older machines dying. The thing is, there is no reason not to buy the newest generation, assuming you need a new computer, even if it is only incrementally faster and more power efficient. Intel's pricing structure or lack of discounting old stock guarantees that.

It's true in the sense that people--and businesses--don't like to spend money when they don't have to. The less impressive a new hardware generation is, the more likely they are to extend the life of their current systems because the cost of upgrading is not commensurate with the improvements over the current setups; and that's at every level, not just high-end gaming. Obviously there are exceptions (some companies upgrade on a fixed X year cycle come hell or high water), but tepid new releases result in tepid sales in a pretty predictable manner.

That's not to say that Intel's slacking on performance upgrades will result in zero sales--that obviously isn't the case--but there hasn't been a huge rush to upgrade desktops for several generations now, and for good reason. The laptop world has obviously been quite different, as the advantages of Haswell were pretty enormous.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rastor posted:

...Last year, about 3.9 million Chromebooks were shipped in the education sector...

As you yourself noted, a lot of those aren't ARM-based at all, and going forward even fewer Chromebooks look to be ARM-based (or at least there'll be more Intel-based options, at any rate). The educational market for Chromebooks/iPads/Surfaces/whatever is already pretty dinky compared to...well, almost everything else. Especially since some of the initial "test bed" locations are turning away from them after finding that they didn't produce the benefits they were hoping for, and ended up being quite expensive in the long run, but that's not really the device's fault.

Don't get me wrong, I love using my iPad for publications and reference books and whatnot, but they're not going to take the K-12 world by storm as long as textbook publishers are demanding silly amounts of money for digitized books, and most college students are going to have a laptop first, and some tablet device second (if at all).

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Twerk from Home posted:

Can anybody with firsthand experience with a NUC tell me if they support RAM overclocking? The Broadwell NUCs support DDR3L-1866 natively, which is cool. It looks like the Skylake NUCs support DDR4-1866, which is terrible and should be marginally slower given the 50% greater latency, right? If I could drop in good DDR4 and clock it at -2400 or -2666 it would all be fine though.

DDR4 has higher latency, but higher bandwidth as well, and the two more or less cancel out when comparing them at the same clock speeds. What are you doing on a NUC that you care about RAM performance, though?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Twerk from Home posted:

In that comparison, it looks like they set both DDR3 and DDR4 to run at the same clock / timings, in my situation it's comparing DDR3L-1866 CL9 to DDR4-1866 CL15.

They're comparing DDR3-2133 C11 to DDR4-2133 C15 (or C14? They're a little unclear). If they were actually the same clocks and timings, DDR4 would be expected to show better performance across the board, rather than trading blows.

At any rate, with the timing differences you're looking at, you could expect a few percent loss compared to DDR3, but nothing that you're likely to notice in your usage. Given that similarly clocked DDR4 seems to help out gaming minimum framerates, while DDR3 seems to help out average framerates, I'd even venture to say that you'd be better off with the DDR4 straight up, unless you plan on substantially upgrading that 5770 of yours.
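To put rough numbers on the latency side of it (quick napkin-math sketch; the timings I've plugged in for the faster DDR4 kits are just illustrative guesses, and absolute CAS latency is only part of the picture next to bandwidth):

```python
def first_word_latency_ns(transfer_rate_mts: float, cas_cycles: int) -> float:
    """Approximate CAS latency in nanoseconds.

    DDR memory clocks at half its transfer rate, so one CAS cycle lasts
    2000 / transfer_rate nanoseconds.
    """
    return 2000.0 * cas_cycles / transfer_rate_mts

# Kits from the discussion above, plus two hypothetical faster DDR4 kits.
kits = {
    "DDR3L-1866 CL9": (1866, 9),
    "DDR4-1866 CL15": (1866, 15),
    "DDR4-2400 CL15": (2400, 15),   # assumed timing, for illustration
    "DDR4-2666 CL16": (2666, 16),   # assumed timing, for illustration
}
for name, (rate, cl) in kits.items():
    print(f"{name}: ~{first_word_latency_ns(rate, cl):.1f} ns")
# DDR3L-1866 CL9: ~9.6 ns
# DDR4-1866 CL15: ~16.1 ns
# DDR4-2400 CL15: ~12.5 ns
# DDR4-2666 CL16: ~12.0 ns
```

Which is basically why "just clock the DDR4 higher" closes most of the gap.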

Rastor posted:

Did you actually read that article?


So in the education market the Chromebooks are outselling the Windows devices, which outsell all the Apple devices combined. Which proves my point, that "Chromebooks are rapidly becoming the standard school computer."

No one cares who is outselling who, because it's all a tiny rear end drop-in-the-bucket market that is nowhere near critical mass or a "standard".

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Twerk from Home posted:

I'll wait for the Skylake NUCs with no reservations that buying a Broadwell would be better, thanks.

Quite frankly, even with a slightly weaker RAM subsystem, you should still get better performance out of Skylake. Not by a whole lot, though, to the point where if you can find a deal on Broadwell (or, hell, even Haswell), you wouldn't be missing out on much by going that route and saving some money.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rastor posted:

When a school is buying a school computer, by sales ratios that computer is likely to be (1) a Chromebook (most likely), (2) a Windows system, (3) an Apple device. As the most likely purchasing decision by schools, the Chromebook is "the standard school computer" -- that is, the normal / average / example computer schools purchase.

2014 educational sales according to the article you linked:
Chromebooks: 29%
Apple (not-iPads): 12%
iPads: 20%
Windows "devices": 37%

So even by your own article, Chromebooks are not the most common platform, and are nowhere near hegemonic enough to be considered "standard." And that's not even starting the argument over use--the college student who has a laptop and a Chromebook/iPad, for example.

Rastor posted:

To loop it back around to a semblance of relevance to the discussion: Windows on ARM was DOA because there were no apps, but Chrome OS is architected such that all Chrome apps must work on both x86 and ARM processors.

Totally agree. ChromeOS is actually pretty cool, and I think Chromebooks fill a pretty decent market segment (once there's a 15" one for suitably cheap, I absolutely will be picking one up for my mother-in-law, because holy hell some people just should not have full computers, and even the current batch with iffy CPUs are fine for playing all her Facebook games for 10hrs a day).

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

FormatAmerica posted:

FWIW, I've seen a couple counties go from iPads > Chromebooks > Windows laptops over the past two years trying to figure it out due to the tremendous friction of switching off of office filetypes (goog still doesn't really open/save natively, likes to make copies of files which is an ECM nightmare) and straight up objective productivity gains (content production is a joke on a tablet).

There's also the issue of enormously expensive vendor lock-in on a lot of the platforms. Book publishers and vendors of "productivity software" (homework kits, etc.) are laughing all the way to the bank in a lot of cases.

Bonus: there's research suggesting that people actually retain less information when they read it on a screen rather than on printed paper, so the entire idea of going with a digital textbook library may be counterproductive from the outset (though there's still a ton of work to be done on that topic).

Rastor posted:

According to the article Chromebooks were outselling Windows devices in 1H 2015.

Not by an enormous amount, and you still have roughly a 2/3 chance of any given device not being a Chromebook. I love me some Google, but saying that Chromebooks are somehow standard is like saying AMD competes directly with Intel on any sort of meaningful level--it's only true for those people who really, really want it to be true.

DrDork fucked around with this message at 04:26 on Aug 27, 2015

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Nintendo Kid posted:

Well it certainly doesn't help that some ebook textbooks require awful software or have terrible layouts that almost seemed intended to be unusable.

The secret is to be awful, but at a slightly lower price than the printed books, so the district buys them because it "saves money". Everyone who actually uses them then bitches and complains because they're terrible. Then next year you release a new version that's more expensive but slightly less awful and everyone demands the district pony up for the better product.

Congratulations! You've now made 1.8x the revenue you would have from selling the single perfectly usable printed book you would otherwise have sold them.

Include one-time-use "companion codes" to access online quizzes, and you can continue making 40-50% of the original book's price for no further effort every god damned year!
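(Napkin math with made-up prices, just to show where a figure like 1.8x comes from:)

```python
# Hypothetical prices purely for illustration -- the pattern is the point.
print_book = 100.0                  # the usable printed book you'd have sold once
ebook_v1 = 0.85 * print_book        # awful first version, priced just under print
ebook_v2 = 0.95 * print_book        # next year's pricier, slightly-less-awful version
companion_code = 0.45 * print_book  # one-time-use quiz code, rebought every year

print((ebook_v1 + ebook_v2) / print_book)   # -> 1.8x the one-time print sale
print(5 * companion_code / print_book)      # -> another 2.25x over five years of codes
```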

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Drunk Badger posted:

Now that we know about the new CPUs, will an upgrade from a 2500K still not be worth it?

2500k @ 4.6 is still where it's at for games.

Skylake has some meaningful performance increases if you spend a lot of your time doing video compression or whatever, but that's about it.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Yudo posted:

It's not like the rest of the world is sitting on their hands and not making good products.

Until I can play TW3 on it, I care not for these "other chip makers" :colbert:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

PerrineClostermann posted:

I think unironically that a phone running full Windows would not be a terrible device and, with a little tweaking, would be awesome.

People would still complain that the screen isn't as nice as their Sony Trinitron.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
So what you're saying is.... we should harvest the primo chips out of high-end laptops and re-purpose them for l33t overclocking chips! Now, how to de-solder the buggers...

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Boiled Water posted:

Mid tier CPUs are cheap. Hell all Intel chips are cheap. If you want high end over the top idiocy give ibm a call.

Mid-tier CPUs aren't egregious, but they're certainly padded with pretty fat margins (and far fatter than in years past) because Intel has the luxury of being a de facto monopoly on everything north of "poo poo box" in the desktop market, and everything north of "Walmart special" for laptops. IBM, meanwhile, operates in a segment where cost-per-chip is almost irrelevant.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Edward IV posted:

I guess that kind of makes sense. It still sounds like a bad idea. And expensive. And heavy. Really heavy. God help you when you inevitably drop it.

If you drop it hard enough to crack an internal waterblock or line, you've already dropped it hard enough to break plenty of other parts, so you're probably not much worse off than you would otherwise be.

But yeah, who thought that was a good idea?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Anime Schoolgirl posted:

There's very little practical difference between Skylake and Haswell.

4790k+H97 if you want to be a bit cheaper and want the caboose on the Haswell train

6700k+H170 when those boards come out..

Why would you suggest top-tier K-series chips for a guy who straight up says he doesn't plan on overclocking? I mean, I get that the 6700 non-K isn't out yet, but neither are the H170s. Not really sure the 6700 would be worth it over the 6600 for RTSs anyhow (most of which don't show compelling advantages for an i7 vs an i5). Quite frankly, my pick would be the 4690 or 6600, and dump the $100 into upgrading the 970 into a used/b-stock 980.

Also, it's absolutely fine to go with 8GB RAM instead of 16GB and save yourself a few bucks if you want to. There's no real way you're ever going to break 8GB in normal use.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Anime Schoolgirl posted:

The same thing applies to the 6700k, only it's more expensive and I only suggest it if you really, really want DDR4.

If you're gonna go for top-end, you might as well go whole-hog. The price difference between a 4790k and a 6700k is like $20 (though availability is obviously a bitch at the moment), and decent DDR4 is only about $15-20/8GB more expensive. The part that hurts right now is being forced into a Z-series motherboard.

I'd agree with you about the 4790 vs 4790k, but I was suggesting the 4690, which is $120 or so cheaper.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

VulgarandStupid posted:

If you can get a micro center 4790k and most H97 mobos are like $80, then you're going to save around $130 between the processor and mobo. And then a little bit more on the RAM. How is this even up for debate?

Because he doesn't live near a Microcenter and you can use the same H97 mobo for the 4690? However you cut it, the 4690 is ~$120 cheaper. I don't understand the confusion here.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Ironically, looking at prices and whatnot made me notice that my local Fry's has an i5-4690K & MSI Z97-G45 for $300, which might almost sorta tempt me to finally upgrade from my 2500k...

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

sauer kraut posted:

I built a cheap i5-4590 box early this year and min/recommended CPU specs for games are getting uncomfortably close already.
Would not recommend playing with that fire if you're prepared to spend well over a grand, you'll end up scouring ebay for a used clapped out overpriced 4790K in 2-3 years.

Not sure what sort of games you're playing where a 4590 won't be sufficient, especially since if you built a "cheap" box I'm assuming you didn't bother to pair it with a 980Ti. TW3 recommends a "minimum" of an i5-2500, and I can tell you that my 2500k is not what's constraining my FPS at high/ultra + 1440p, so methinks your 4590 will be fine--sure, it's 20% slower than the 4790k, but unless you've got a GPU to match, it's probably not gonna matter much.

The Iron Rose posted:

That board has Killer ethernet, which causes bluescreens. Don't buy it.

Ugh. Why can't someone just take an Intel NIC, put some fancy red paint on it, and call it a "high-end gaming solution"? It'd be better than 95% of these custom-rolled chipsets.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

LiquidRain posted:

Honestly, RealTek is just fine.

On budget boards, I agree: Realtek chips do, in fact, function. But once you start going north of $100, not going Intel says to me you're trying to cut corners in places you shouldn't.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
At resolutions neither the phone nor the computer is likely to be able to display, too.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Anime Schoolgirl posted:

Costs as much as a 4790k :downsgun:

But dat frame consistency...

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Col.Kiwi posted:

Eh maybe but I'd say its more likely theyre getting a smallish quantity from their distributor and they're just pricing them based on what it costs to import from another country, cause they know they're so in demand and so unavailable in north america.

Yeah, I mean as much as I'd love to see them closer to MSRP and all, the 5775C is often found in the $450-$550 range, when you can find it at all, so it seems that's just the price you're gonna pay until the supply issues get sorted out. Which may never actually happen. So for now I can't get too angry at NewEgg for pricing them at what's more or less market average.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

VostokProgram posted:

I'm guessing there's a similar chip in the works for Kaby Lake. I can't imagine Intel completely throwing out the "let's have an enormous L4 cache" idea, surely there's some niche somewhere that'll pay Intel big money for those CPUs.

That niche is (among others, I'm sure) gamers who already drop $650+ on top-of-the-line GPUs to match with their $300+ CPU on their $150+ Z-series motherboard. The 5775C was not prohibitively more expensive than various other non-C chips, while offering meaningful improvements in ways that simple IPC and clock bumps don't really bring anymore. People were buying them for upwards of $550, despite the MSRP being more like $350. There absolutely are people who'd snap them up.

Now whether that niche is lucrative enough to warrant a separate fab line (since obviously these are a whole different ballgame than just binning and lasering off bits of chips that make up the majority of the Intel lineup) is another question entirely.
