Nintendo Kid
Aug 4, 2011

by Smythe
If they knew they could put out reliable 8 core parts right now, they'd definitely do it because they could charge more money versus the 6 core ones.


Nintendo Kid
Aug 4, 2011

by Smythe

Shaocaholica posted:

How can you tell which procs (SB and IB) are die harvested and which aren't? Are all IB dual core procs moving forward going to be die harvested quads?

The only CPUs guaranteed not to be rebinned higher-level procs with some manner of defect are the most expensive CPUs in any given family.

Core Solo and Core 2 Solo processors, for example, were almost entirely Core Duos/Core 2 Duos with a minor-to-major defect in one of the cores.

Nintendo Kid
Aug 4, 2011

by Smythe

HalloKitty posted:

I actually seriously want a 15" Macbook Pro because you can get it with a 1680x1050 anti-glare display.

Obviously I can't afford one, but I support the fact that Apple offers a myriad of 16:10 options. 13.3"? 1440x900.

Delightful stuff.

Dell and Lenovo usually have 15 inch laptops available with 1920x1080 or, more rarely, 1920x1200. I've occasionally seen such panels from Toshiba and others, but they're rare there.

Also: Why can't you get anything better than 1280x800 on the 13 inch MacBook Pro? Just put the drat panel from the 13 inch Air in there so people can get 1440x900. :argh:

Nintendo Kid
Aug 4, 2011

by Smythe

Factory Factory posted:

gently caress, the Sony Z can have a 13.3" 1920x1080 screen. That's a crazy laptop.

poo poo, what I wouldn't give to have that dpi on a desktop monitor.

Nintendo Kid
Aug 4, 2011

by Smythe
I have a Dell XPS 15 laptop. It has a 1920x1080 15 inch panel and it's on the consumer side.

Nintendo Kid
Aug 4, 2011

by Smythe
XPS starts at $750 and the cheapest one with a 1080p screen is $1040 at the moment (Sandy Bridge i5, decent Nvidia card with Optimus, RGB LED-backlit screen, yada yada). Inspiron is the other Dell consumer branding and that's frankly mostly crap.

It used to go Inspiron -> Studio -> XPS, but they scrapped the Studio tier and moved most of those models into XPS, with a few becoming the better Inspirons.

Nintendo Kid
Aug 4, 2011

by Smythe

Alereon posted:

It's how my mom uses Netflix and watches long videos. It's also a pretty popular way to watch slideshows, like from vacations. Though these are all people with laptops who can just run an HDMI cable to their LCD TV, I don't know of anyone who would use an HDTV as a monitor for a desktop.

Best Buy will tell people to buy a desktop and just hook it up to a 1080p TV though, and I've known lots of college-age kids who get a TV to double as a monitor and a TV for their dorm room.

Nintendo Kid
Aug 4, 2011

by Smythe

mayodreams posted:

I am really pissed about the lack of Thunderbolt devices right now. Apple has been shipping machines with it since like April, and the only things we have available are expensive storage solutions, one video interface, and the Apple Cinema Display. I am dying for the dock that Belkin demoed over the summer that has more USB ports, Firewire 400/800, and a Gigabit Ethernet connection. I'd even settle for just a TB to gigabit adapter for my MBA because the wireless only is killing me.

I hope that whatever issues are causing a lot of delays in these products clear up before TB goes to a wider audience, because it is not off to a good start.

I've highlighted why the problem is happening. Only Apple has any Thunderbolt-capable computers at all, unless you count Sony's one laptop, whose non-standard port (it doubles as USB 3.0) carries the Thunderbolt signal out to a special dock.

Asus, Acer, Sony, and Gigabyte might have actual Thunderbolt support starting this spring. HP outright said earlier this year that they weren't going to do Thunderbolt for the next few years. Dell, Lenovo, Toshiba, et al. have neither announced any plans to do Thunderbolt nor explicitly said they won't, like HP has.

So there's exactly one manufacturer with actual Thunderbolt ports on their computers, and it's Apple, and it looks like it will stay that way until this spring, by which point Thunderbolt will have been out a year. Add in the fact that there's not much incentive for peripheral makers to support Thunderbolt when using USB instead works on every Mac in use, let alone every PC in use, and it's no surprise.

Nintendo Kid
Aug 4, 2011

by Smythe
Yeah, although high data rates are in the spec, few devices other than Thunderbolt gear use them.

Also, Thunderbolt (as Light Peak) was originally designed around completely different connections, and was then shoehorned into copper and the Mini DisplayPort jack.

Nintendo Kid
Aug 4, 2011

by Smythe

Longinus00 posted:

The point is plugging a device into someone else's macbook which is just lying around somewhere is a lot easier than opening their desktop and stuffing a new card in/soldering a new bios. Especially since you can do this while the computer is running and maybe be able to get something out of it even if it's only plugged in for a few seconds. Imagine cruising around a library with a worm payload on a flash drive back when all windows machines would autorun.

Even if there are some restrictions to plugging devices without user interaction it doesn't prevent lending/borrowing of malicious devices. You're far more likely to borrow a tbolt device than borrow someone else's pcie cards or randomly solder poo poo to your computer.

FireWire did this too. Also, I'm pretty sure this could be done with PCMCIA and ExpressCard on laptops!

Maybe you should ask yourself why you'd need to worry about your friend giving you a malicious Thunderbolt device?

Nintendo Kid
Aug 4, 2011

by Smythe

Toast Museum posted:

Wouldn't the main risk be infected storage devices?

Well, did you ever worry about that when using a FireWire-connected drive?

If you have an infected Thunderbolt storage device, it's not likely to be able to steal the info someone wants and then send it back to them, unless you return the device to the malicious person. It's also not likely that a malicious program installed on someone's computer could reprogram a Thunderbolt device to execute an attack on its own and then wait for the device to be plugged into another computer.

Nintendo Kid
Aug 4, 2011

by Smythe

Bob Morales posted:

1920x1080 on an 11.6"?? WHY?

Why wouldn't you want that, as long as the GPU can keep up?

Nintendo Kid
Aug 4, 2011

by Smythe
Well, to be fair, technically there are a few random arcane workloads that Bulldozer performs better in. But even there it comes at the expense of higher power draw.

Nintendo Kid
Aug 4, 2011

by Smythe
Uh, is there actually any evidence that ARM servers were already popular in any way, or going to be anytime soon? It's a bit premature to call Intel late.

Nintendo Kid
Aug 4, 2011

by Smythe

Shaocaholica posted:

So Intel used to have some ULV mobile procs like the U7700 which had a TDP of 10W. Whatever happened to that class of CPU? Seems like the lowest power IVB is 17W, which is considerably more than the 10W of the older 65nm C2Ds. What gives? The embedded GPU?

Yeah, I think the overall power usage of a system using that IVB part is lower than that of a comparable system from back in the day with a 10W mobile C2D plus a chipset with its own integrated GPU and so on.

Nintendo Kid
Aug 4, 2011

by Smythe

coffeetable posted:

This was the chip thread highest up the page, so while not relevant I couldn't find a more informed place: what's going on with the 8GB of GDDR5 system memory in the PS4? I was under the impression that 1600MHz DDR3 was more than enough to keep the CPU sated, and that GDDR5 was a drat sight more expensive. Is a couple of million sticks of the stuff enough to bring the price down so far that it was cheaper for Sony to grab 8GB of GDDR5 rather than 4GB each of it and DDR3?

I'm pretty sure the GDDR5 RAM is being shared with the GPU in the system, so they already need the bandwidth for the graphics.

Nintendo Kid
Aug 4, 2011

by Smythe

Palladium posted:

Moore's Law is now biting back at their rear end when even 5 year old PCs are too powerful for the average YouTube cat video viewer

This isn't true. Generally people stuck on those kinds of computers actually do have trouble doing things on the modern internet, not least because they're stuck on XP or Vista and also are likely to have a bunch of crapware installed that slows down their systems.


Palladium posted:

Add to the perpetually declining PC sales

If by perpetual you mean "once in 2001, and once in 2012". Because that's what happened: yearly PC sales have declined versus the previous year only twice, in 2001 compared to 2000 and in 2012 compared to 2011.

Nintendo Kid
Aug 4, 2011

by Smythe
Or they could just ditch making laptops altogether and avoid the hassle of attempting a third architecture switch on their OS. Frankly that seems a lot more likely than making new ARM OS X laptops that perform well enough in OS X for people to still want them, while also coordinating things with x86-64 iMacs and Mac Minis running OS X.

Nintendo Kid
Aug 4, 2011

by Smythe
What kind of revenue stream has AMD had going from being the GPU provider in the 360 and Wii? And what was the revenue like when Intel was providing the CPU for the original Xbox?

Nintendo Kid
Aug 4, 2011

by Smythe

hobbesmaster posted:

But that's all smartphones were back then, a phone with a terrible browser and terrible email support.


cstine posted:

And 3rd party apps - which was the big missing thing at the iPhone launch.

But no, I really don't think ANYONE expected smartphones to blow up, because at the time, they were all clunky expensive piles of junk.


So? They were more popular than PDAs had been, and we'd already seen "cell phones in general" reach billions sold by that time, on top of already knowing that regular computers had taken off.

The only question back then was "when will everyone have a smartphone", not "will smartphones ever be popular". Heck, at that point a lot of features once considered exclusive to smartphones - like real data connections, browsers beyond WAP, music playback, screens that could actually show something besides text - had started appearing on regular phones.



And for what it's worth, getting the iPhone SoC contract wouldn't necessarily have guaranteed Intel success in the mobile arena. Not only have a lot more non-Apple smartphones and tablets been sold, but Apple could have switched off Intel to do chips on their own, just like they did with the Samsung ARM CPUs from the original iPhones.

Nintendo Kid
Aug 4, 2011

by Smythe

WhyteRyce posted:

I don't believe switching from ARM to ARM is quite as big of a deal as switching from x86 to ARM or vice versa, but I could be wrong. It's not an impossible thing to do but you would have had some inertia on your side and Apple hounding your rear end to make a better product that fit their needs. Instead, Apple went off and figured out they are perfectly happy designing their own stuff and got lots of experience doing it. Now Intel has to make something that much more compelling to get Apple to consider switching.

Intel definitely did not have a suitable low-power chipset ready besides the XScale ARM stuff they sold off in 2006, around the time Apple would have been shopping around. So it'd have had to be XScale ARM or nothing.

cstine posted:

Those pre-iPhone/Android smartphones were pretty bad - you're looking back through some rose-colored glasses and being smart enough to figure out which end is up on a power button - my mom couldn't figure out how to use her RAZR, and she's on her fourth iPhone - and, don't forget - Android looked like a Blackberry until very, very late in its development cycle.

I'm not looking at it with rose-colored glasses; people were well aware by 2005-2007 that "when a new category of computing comes out it's niche, but it'll eventually be usable by everyone". So Symbian sucked poo poo, big deal; any home computer you care to name in the 80s was pretty awful for the public, but that changed very quickly. The question going around was "will smartphones be mainstream in 2010 or not til 2020?", not "will smartphones be mainstream at all". Lots of companies made decisions based on "yeah, it's going to take a while" instead of "it's going to be real soon".

Nintendo Kid
Aug 4, 2011

by Smythe

Ryokurin posted:

They had been saying that for years by that point, and like you said, features were starting to overlap. But no one saw items like music streaming or a full-featured browser as a potential game changer. The need for data was still pretty much seen as something that only business people would need, and if a consumer needed it, it was for WAP or IM. People also forget the iPhone was $600 back then. That's why no one took it seriously. Hell, even today that price for an unlocked phone makes people cringe.

Again, all of that was about "it will take 13 years for smartphones to be everywhere" instead of "it will take 3 years".

Also the iPhone still costs that much if you buy it outright. Tons of regular phones then and now cost $300-$400 easily if you bought them unsubsidized. Like, duh, phones costing a lot of money was nothing new. In the 90s getting a plain old cell phone unsubsidized could easily run you more than $1000.

The iPhone 2G wasn't taken seriously because not only did it cost as much as all the other unsubsidized smartphones, it also did not have any apps or 3G service yet still cost as much as devices that did. Which is unsurprisingly why the iPhone 3G, released a few months after the App Store finally opened, sold a shitload better.

Nintendo Kid
Aug 4, 2011

by Smythe

necrobobsledder posted:

I would hesitate to call PCs the same sort of "durable goods" economic category as refrigerators, cars, and construction equipment, but I was moreso looking at the angle from the perspective that Intel can only push the rest of the market (and more importantly, their customers) around so much, including their DRAM partners when their customers' shiny object dreams are changing in directions that put Intel in a position where they're not that important for the solution.

They actually kind of are. Sure, they haven't yet reached the point cars are at, where the average one on the streets is 10 years old, but there are tons of people using them for 4+ years, and there will be a lot more as we proceed onwards.

Nintendo Kid
Aug 4, 2011

by Smythe

Josh Lyman posted:

I believe Intel's mobile chips are 32 bit, and besides, it's unlikely Apple will switch away from custom silicon. Remember, an iPhone 5 gives you a full day of usage with less than half the capacity of a AA battery.

An alkaline AA battery holds 8100 joules. The iPhone 5 battery holds 19,260 joules (the 4s and 4 were 19,000 joules).

Nintendo Kid
Aug 4, 2011

by Smythe

Josh Lyman posted:

I was referencing the iPhone 5 having a 1440 mAh battery and a AA having 2700 mAh.

1440 mAh at 3.8 volts is significantly more stored energy than 2700 mAh at 1.5 volts. Going by the joule figures above, an AA holds about 44% of the energy of an iPhone 5 battery, not the other way around.
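
For anyone who wants to check the arithmetic, it's just energy = charge × voltage. Here's a minimal sketch using the figures quoted in these posts (the gap between the nameplate math and the ~8,100 J number comes from an alkaline AA delivering well under its rated capacity at smartphone-sized loads):

```python
# Quick sanity check of the battery comparison above (energy = charge x voltage).
# Capacity and voltage figures are the ones quoted in these posts.
def battery_energy_joules(capacity_mah, voltage):
    return capacity_mah * 3.6 * voltage  # mAh -> coulombs is x3.6

iphone5 = battery_energy_joules(1440, 3.8)       # ~19,700 J, in line with the ~19,260 J above
aa_nameplate = battery_energy_joules(2700, 1.5)  # ~14,600 J on paper
aa_delivered = 8100                              # the delivered-energy figure quoted above

print(f"AA nameplate / iPhone 5: {aa_nameplate / iphone5:.0%}")  # ~74%
print(f"AA delivered / iPhone 5: {aa_delivered / iphone5:.0%}")  # ~41%, ballpark of the 44% above
```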

Nintendo Kid
Aug 4, 2011

by Smythe
An interesting side effect of that is that a lot of the "list your system specs" programs people would use at the time would identify both Pentium M and first-gen Core Duo/Solo chips as "Pentium III", even while identifying the Pentium 4 and the contemporary AMD chips correctly.
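
That happens because those tools keyed on the CPUID family number alone: the Pentium M and first-gen Core chips are P6 descendants that report family 6 just like the Pentium III, while the Pentium 4 reports family 15. A hypothetical sketch of that kind of naive lookup (the table is an illustration, not any particular tool's actual code):

```python
# Naive "system specs" style CPU naming, keyed on CPUID family only (illustrative).
# Family/model values are the published ones: Pentium III = family 6,
# Pentium M = family 6 model 9/13, Core Duo/Solo (Yonah) = family 6 model 14,
# Pentium 4 = family 15.
FAMILY_NAMES = {6: "Pentium III", 15: "Pentium 4"}

def naive_identify(family: int, model: int) -> str:
    # A smarter tool would also check the model number, but many didn't.
    return FAMILY_NAMES.get(family, "Unknown Intel CPU")

print(naive_identify(6, 13))   # Pentium M (Dothan) -> reported as "Pentium III"
print(naive_identify(6, 14))   # Core Duo/Solo (Yonah) -> reported as "Pentium III"
print(naive_identify(15, 4))   # Pentium 4 -> correctly "Pentium 4"
```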

Nintendo Kid
Aug 4, 2011

by Smythe

Alereon posted:


But isn't ARM too slow? Too slow for what? It would be a long time before Apple could switch Macbook Pros (or iMacs or Mac Pros) to ARM, but their low-end devices like the Macbook Air and the now-retired Macbooks (and maybe even Mac Mini) are targets. ARM processors CAN scale up in performance, that is exactly what Apple has done with the Cyclone core in the A7 CPUs. More importantly, Apple would redesign their product stack around the strengths of the CPU lineup they actually had available to them.


ARM designs aren't putting out anything that can approach the latest and second-latest generation Core i5s any time soon. And that's what's in the Airs and the Mini as the low-end option.

It'd be a hell of a thing to "play up the strengths" to a level that justifies how expensive those two products are in their base models while real performance stays as low as it would be.

Nintendo Kid
Aug 4, 2011

by Smythe

Alereon posted:

Intel thinks four Silvermont cores are a viable alternative to two Haswell cores for ~15W and below and low-end markets. Apple's Cyclone cores look like viable competitors to Silvermont at comparable TDP. Apple "redesigning their product stack" means just that: if they can't achieve Macbook Air performance (and thus price) with their SoC, call it something different or charge less. We're trying to fit ARM into Apple's existing Intel lineup, they're going to have a new lineup if/when they do this.

Apple doesn't like to charge less, ever. Their entire Mac business is based around not selling any laptops under $1000, and the Mini starts at $600 - though that's just for the computer and its power cord. They don't really make "low end" stuff; they just kinda accept they're never going to exceed 5% of the computer market ever again and stick to being a "high end" brand with mid to high end products.

And a slow ARM not-Macbook Air would find itself trying to compete, with its kinda-OS-X-but-not-really OS, in a world where your average Windows laptop sells for around $400 and can run everything people want. You might remember that this is the same "doesn't run any of the stuff you already have besides websites, but has great battery life" niche the original Linux-only netbooks tried to fill, only with nothing near the price advantage they had. And those were quickly replaced by slightly more expensive, slightly lower battery life, but full Windows netbooks.

Nintendo Kid
Aug 4, 2011

by Smythe

Alereon posted:

You're going down this weird rabbit hole of worst-case scenarios. "What if Apple's SoC isn't good? What if they don't notice until too late and the only way they have to compensate is by cutting price? What if it can't run OSX? What if they just switch to making lovely low-end Chromebook-ripoff devices, how would they compete with Windows?!" None of those are real scenarios. Apple's SoC will at least be competitive and Apple has a lot of different ways to tweak system performance to give users the kind of experience they want (such as their investments in fast PCI-E SSDs).

This isn't the worst-case scenario, this is the best-case scenario. There's nothing switching to ARM would get Apple here that wouldn't be solved by simply having an official keyboard and touchpad for their iPads; and the Apple TV is already the ARM equivalent of what a Mac Mini ARM edition would be useful for.

And yes, it wouldn't really be running OS X, since real OS X will be sitting right there on their main computer products and is clearly going to get the lion's share of application support and importance.

An ARM version of a Macbook Air would be a lovely laptop that performs poorly, doesn't have the benefits of either the iOS or the OS X ecosystems, and would only have the dubious benefit of marginally better battery life. An ARM version of the Mac Mini would be a marginally more capable Apple TV, which you could already get if they simply upgraded the Apple TV again. These are best-case scenarios, excepting the unlikely scenario of Apple switching all their stuff to ARM, which would at least mean those things wouldn't become a third Apple OS ecosystem.

Nintendo Kid
Aug 4, 2011

by Smythe

bull3964 posted:

The biggest issue I see is they aren't going to want to simultaneously run OSX on ARM and x86. They will either make a clean break to ARM for all their computers or fork the consumer OS into something else entirely.

The consumer OS is all they have though. They killed their actual server business, and they've done plenty to drive away professional use while doing little to encourage new use.

In essence, Apple is extremely ill-suited to being the people bringing a successful ARM switchover into being.

Nintendo Kid
Aug 4, 2011

by Smythe

Alereon posted:

I don't know why you keep repeating that ARM-based products are slow like it's a mantra. An ARM SoC in a cellphone or tablet is slow because it operates in a 1-4W TDP. Apple has designed a core that they can use to make a fast SoC in the ~10W+ space. Current products aren't overcoming the Turbo Boost advantage on i5s but we're not talking about current products.
This is a very valid point, though Apple has supported multiple architectures simultaneously in recent history. It certainly wouldn't be fun for them and they would only do it if they thought it was worth it or had ways to make it much less troublesome.

There aren't any ARM SoCs performance-competitive with the i3s in cheap Windows laptops, let alone the i5s Apple has as the baseline in their computers. There won't be any anytime soon either, because Intel is doing very well at continuing to run up performance, and ARM designers as a whole aren't showing signs of being able to catch up fast enough. And again: on top of this, Apple has already based their entire computer business on charging high prices and not selling anything low-end - putting in ARM replacements doesn't make sense there, particularly when they already have the iPad line out there.

Apple never really supported multiple architectures. Twice before they've had one old architecture kept on life support and one new architecture that got all the new stuff. That's a plan for transitioning everything, not a plan for (or experience with) actively maintaining two at once long term.

Nintendo Kid
Aug 4, 2011

by Smythe

eames posted:

Warning, wild offtopic speculation ahead :siren:
I think the more likely route is a Macbook Air that can be separated at the hinge, with a fully functional but even lighter iPad Air as the display and a powerful CPU/Battery/IO/Keyboard for heavy work as the (optional) "base" part.

It’ll take a few years until we get there but that seems to be the obvious solution to me, although I have no idea how it would work on the software/OS side of things. I find it unlikely that they would mix two different architectures in one device, though.

Honestly, since iOS devices have already sold far more in the 6 years they've been around than all Macs put together have since 1984, it's probably best for them to just keep focusing on "make better iPads" rather than "carve up the small OS X market with cut-down OS X computers on ARM".

One thing you could do with a docked iPad is give it a way to cool itself better, letting the CPU it already has sustain higher performance as long as it's in the dock. That could even mean putting the higher-power CPU in the iPad itself and having iOS refuse to run it in a high-power state unless it detects that it's in the appropriate dock.

(and yes I'm aware the article I linked is from last year, but since then the gap between iOS devices sold and Macs sold has gotten even huger)

Nintendo Kid
Aug 4, 2011

by Smythe

GrizzlyCow posted:

Apple has gotten a lot bigger since then I do believe.

Not in the sector of the market where Intel CPUs are used though.

Nintendo Kid
Aug 4, 2011

by Smythe

shrughes posted:

Oh they certainly have.

They most certainly have not. A bit under 5% of the computer market last year, up from 3.5% the year they switched from PPC to Intel, is not "a lot bigger". For comparison, Lenovo and HP were both pulling 15.5% last year, with Dell and Acer both doing about 10%.

And compare Apple's sub-5% in computers to its 25% in smartphones or 50% in tablets, and note how both of those much more successful markets for them are all-ARM - so much so that last year alone they sold more iOS devices than Macs have sold over all time. There's your impact. Like seriously, somewhere around 190 million Macs sold since 1984 versus well over 200 million iOS devices in just 2012.

Nintendo Kid fucked around with this message at 04:57 on Nov 4, 2013

Nintendo Kid
Aug 4, 2011

by Smythe

shrughes posted:

Apple's laptop market share is greater than 5%. Maybe you're thinking of the overall computer market share.

They sold about 19 million laptops out of the 200 million laptops sold (the total computer market was about 310 million, so laptops were roughly 65% of it), so that's a bit under 10%. Other companies again sold much more. But again, they started off with about 7% of laptops sold the year they switched from PPC to Intel, so that's not massive growth.

Again, for comparison: Lenovo pulled in 16% share (the best they've ever had), HP was doing 15%, and Dell was around 13% or so.

Nintendo Kid
Aug 4, 2011

by Smythe

shrughes posted:

The size of the market grew like 300-400% though.

And most other companies grew much more than they did. Take Lenovo, for instance: they've nearly tripled their share of the market from then to now.

Any way you slice it, Apple's only big growth has been in things that do not use Intel parts. Music players, smartphones, tablets, all on ARM.

Nintendo Kid
Aug 4, 2011

by Smythe

Vanagoon posted:

The Devil's Canyon logo is hilariously bad-rear end.



Holy poo poo Intel.

Personally I woulda used that for the last Pentium 4s.

Nintendo Kid
Aug 4, 2011

by Smythe

Cardboard Box A posted:

This might be a kind of weird question but is there anywhere I can find some decent or even comprehensive benchmarks and comparisons between old Core Duo chips like the E5200 and modern low power Celerons like the Celeron 1037U or Celeron G1820? More than the minimal info that sites like CPUBoss have.

The E5200 was the "Pentium"-branded, aka low-end, version of the Wolfdale Core 2 Duo CPUs from 6 years ago, so both the 1037U and G1820 parts should handle anything the E5200 could, unless the workload you have in mind specifically needs the additional processor cache the E5200 had. The main difference will be that the Celeron parts also have onboard GPUs.

However, the E5200, assuming you already have it, will perform slightly better on most tasks that don't use any newer instructions, so if you were considering replacing one with those Celerons, it wouldn't be worth it.

Nintendo Kid
Aug 4, 2011

by Smythe

Cardboard Box A posted:

Thanks for your answer.

I have underclocked and undervolted the E5200 down to 1.2GHz I think, and I still don't think it's anywhere near the low power the 1037U can get to, so I'm sure it can't compete there, but maybe it will do OK once I bring the clock up a little.


Ah, I see. Yeah, if you're after low power draw, those Celeron parts will outperform that Wolfdale watt-for-watt. It's only if you're willing to use full power draw that the E5200 would come out ahead.


Nintendo Kid
Aug 4, 2011

by Smythe
The standard horizontal sits-on-the-desk case fell out of style about the same time laptops became the majority of computers sold and everybody stopped having bulky CRTs that took up roughly the same amount of desk space anyway.

Since the "desk top" case design was intended for space saving in combination with a CRT, they didn't have much use once you could fit an LCD and a minitower in the same space, or forgo the desktop computer altogether.
