PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Toast Museum posted:

Tech-wise, what would Intel gain from buying AMD? GPU know-how seems like the big thing, but what about on the CPU side?

How to screw up their per-core performance. I'd love to give AMD a try, but ever since I started building my computers, Intel's had the best bang-for-buck in the enthusiast segment.


PerrineClostermann
Dec 15, 2012

by FactsAreUseless
So what's up with these new Bay Trail Atoms? Why are they so damned good? I've got one in my tablet and it's blowing my expectations out of the water.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
What architecture was the old Atom based on? And the new one? And more importantly, does the new Atom have that nifty hardware support for fast AES?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Shaocaholica posted:

Hmm. Back in school I was able to play back 720p offline h.264 on it although they were circa 2006 720p encodes.

If you're watching anime, the 10 bit ~future~ will demolish that CPU. Normal, 8 bit h264 should work if it's low bitrate.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Shaocaholica posted:

Not to derail this thread but where is all this 10bit anime coming from? Are studios releasing 10bit blurays already? I always thought that 8bit was deemed enough for the consumer and any extra bits would probably go towards compression/4k and not bit depth. Is there even a consumer 10bit signalling standard to go from Bluray > HDMI > TV?

It's purely for the compression and image quality benefits, specifically reduced banding. There are basically no 10 bit sources or displays. Anime encoder Diaz wrote a whole explanation of why, if you Google it, and he's a lot more versed in it than I am.
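The banding argument comes down to quantization step size. Here's a tiny illustrative sketch (the function names are made up for this post, nothing here is from a real encoder):

```python
# Why 10 bit reduces banding: a smooth luma gradient quantized at
# 8 bits has only 256 representable levels vs 1024 at 10 bits, so
# each visible "band" on an 8 bit encode is 4x wider.
def levels(bits):
    """Number of representable levels at a given bit depth."""
    return 2 ** bits

def step_size(bits):
    """Fraction of a full-range gradient covered by one level."""
    return 1 / levels(bits)

print(levels(8), levels(10))         # 256 1024
print(step_size(8) / step_size(10))  # 4.0
```

The extra precision inside the encoder also means rounding error accumulates more slowly, which is why 10 bit encodes can look better even on an 8 bit display.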

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
This makes me wonder how big the difference between my two most recent computers really is. C2D E6750 @ ...3.5GHz? I can't even remember. Compared to my 2600k @ 4.5GHz.

Hell, how does my 2600k compare to whatever the latest processors are?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Alereon posted:

The 45nm C2Ds were a lot better, the originals were often paired with P4 chipsets and slow DDR2 and that was not a good combination. A 4-series chipset with AHCI support and DDR2 800+ is in a much better position today.

Was the e6750 one of them? I think it was a 65nm...

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
12MB? drat. I think the 2600k only has 8MB.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
...It looks like the only difference is the slightly higher clock speed.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

canyoneer posted:

14nm is about ~30 silicon atoms laid end to end :wow:

I wonder how much engineering they had to do to deal with quantum effects at that level...
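For a rough sanity check on that figure, using silicon's cubic lattice constant (~0.543 nm, a textbook value; the exact atom count depends on which spacing you count):

```python
# How many silicon unit cells fit across a 14 nm feature.
# 0.543 nm is silicon's lattice constant; counting individual atoms
# along a bond direction would give a larger number, so "~30 atoms
# end to end" is a right-ballpark claim, not an exact figure.
SI_LATTICE_NM = 0.543
cells = 14 / SI_LATTICE_NM
print(round(cells, 1))  # 25.8
```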

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
So is it just lack of competition from AMD that's making it not worth upgrading CPUs these days? Is it the lack of software needs? I mean, my 2600k and radeon 6870s still do spectacularly.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Any news in the low-power world of the Atom? AMD just released some info on chips designed to compete with the current crop of Atoms, and they seem to stack up favorably. With the high end pretty much stagnant, I'm finding myself more and more interested in how powerful and power-sipping my Windows tablets can be.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

carry on then posted:

I'm not convinced this is entirely a bad thing considering the longevity of any one PC build these days.


The reason for that, at least where gaming is concerned, is that PC requirements have been tied to consoles forever.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

sincx posted:

I think it's sad that the days of getting significant processing gains from upgrading every 2-3 years are over.

It seems that my 2600k (main computer) and 2500k (media center) won't be replaced for quite a while.

I hear you. Even upgrading a GPU is almost questionable at this stage.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Agreed posted:

Well, what the gently caress am I going to do with a couple of suddenly last-gen parts? :cry: Never buy and then don't build a computer, it's the dumbest feeling. gently caress you too, bad back.

Build a computer only marginally less capable?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

cisco privilege posted:

I'm not really feeling let down by my 2600K. It's nice having a ~3-year-old PC where the only upgrade I've wanted is the GPU. I did get a bigger SSD though as the 128GB 830 was feeling a little cramped.

I can't even do that. Dual 6870s are still really good.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

FormatAmerica posted:

Unfortunately Kerbal Space Program is one of them :jeb:

Emulation, too.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

r0ck0 posted:

But then you lose half of the resale value.

People resell computer parts?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

MaxxBot posted:

The two main languages are VHDL and Verilog, with the latter being more popular I believe. I've done a fair amount of Verilog and it's not too bad, the syntax is sort of similar to C.

VHDL is the devil.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
I have a 2600k at 4.5GHz. I couldn't justify an upgrade either, so I ended up just getting a solid state drive. CPUs haven't really improved much in terms of performance since 2010.

(If you want an appreciable upgrade get a solid state drive. It's ludicrous)

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

1gnoirents posted:

Anybody pre ordering?

http://www.livescience.com/47240-ibm-computer-chip-simulates-brain.html

Probably run games at +60 fps at 4k now. 46-400 billion calculations per... watt

Architecture is vastly different, it wouldn't even run Crysis.
:goonsay:

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Agreed posted:

Big picture horrible, small picture "still pretty much completely satisfied with my 2600K setup, just waiting on it to croak to put the pre-refresh Haswell bits together and that'll probably last another several years" - I know that Processors Are loving Hard but still, that seems a rather vital feature to just screw the pooch on. Wow.

Starting to feel some of that lack of competition going on here.

Tell me about it. I looked at potentially building a new computer. Turns out, if I wanted better performance, I'd have to drop 800 bucks or something stupid to get more cores and slightly worse single-core performance. Or spend 500 bucks on a flagship GPU. Or 300 on faster, larger RAM. In the end, the only way to upgrade a PC build from 2010 without taking out a loan was...to get a SSD. Now I have no idea what to do. Nothing anywhere seems worth it. We live in weird times.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

HalloKitty posted:

That's fantastic, though, because you can save money and spend it on something else.

Edit: wait, you said 2010. That means Nehalem, so yes, you can definitely benefit from Devil's Canyon.

It's only Sandy (2011) and above that I'd say are truly not worth upgrading. Sandy saved a hell of a lot of power over Nehalem.

Er, I've got a 2600k I've OC'd to about 4.5GHz. So I lied, I got my stuff summer of 2011. Sorry about that.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Intel CPU and Platform Discussion: BluRay Conglomerate and DRM Apologist Discussion

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
I really want that so I can have a rock-solid Linux host for real things and a high-powered gaming Windows client for, uh, gaming.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Perhaps it refers to :tinfoil:?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Wintering Stinkbug posted:

I picked up an atom powered low end notebook today. When did atom stop being terrible?

Somewhat recently. It allowed for a deluge of great little net/not/notebooks, windows tablets, and transformers like my kickass Asus T100.

I've always loved the form factor, and now it's even usable. God bless our strange, power-efficient age.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Panty Saluter posted:

Boy, autism really is hard to live with.

I dunno, he seems pretty happy in the video. Plus he can afford to play with so much ram.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Welmu posted:

Cannonlake delayed 6 months, will launch in the latter half of 2017 according to Mr Krzanich.

Is this the same delay that brought us Kaby Lake and they're just late in reporting it?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

PC LOAD LETTER posted:

That was the reason given but AMD has been able to do pretty well with 'regular' sockets for quite a long time now so I'm not sure how necessary it really was to do it.

Given AMD's current state and performance I'm not sure I'd say they were doing fine with their current sockets.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

BobHoward posted:

Nobody said the gains were ginormous. And while one can't say for certain what's going on without inside information, it's been rather obvious for many years that AMD can't afford to do everything it needs to do. What goes on the chopping block first? Minor features. Ones where it isn't immediate death-of-the-company type stuff if you don't have them. LGA is in that category.

For example, consider power delivery. It's not about whether you can deliver 150W at all. There's plenty of cross sectional area in those PGA pins, they can carry a lot of amps without melting. The real problem is that higher inductance pins cause greater transient response -- meaning that when your high performance CPU core has current demand spikes (they always do), it induces an unpleasantly large momentary voltage droop at the core. In other words the package causes problems with voltage regulation at the point of load, not reduced power.

This is a solvable problem: specify a slightly higher core voltage, far enough above the true minimum that the transients never endanger data integrity. But now your CPU is using more power, and you have to bring it back down under the TDP target somehow, and welp maybe it's time to cut frequency a bit...

There's dozens (maybe even hundreds) of minor things like this where, if taken alone, it's not a huge advantage for Intel, but the fact that Intel is able to do them all adds up to a substantial advantage.

Death by a thousand feature cuts?
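The transient argument above is just V = L·di/dt across the package pins. A back-of-envelope sketch with made-up but plausible numbers (1 nH effective pin inductance, a 50 A demand swing over 1 µs — illustrative assumptions, not measurements of any real socket):

```python
# Voltage droop induced at the core when current demand spikes:
# V = L * di/dt. Higher pin inductance -> bigger momentary droop,
# which forces a higher nominal core voltage as margin.
def droop_volts(inductance_h, delta_i_amps, delta_t_s):
    """Momentary droop across the package inductance."""
    return inductance_h * delta_i_amps / delta_t_s

v = droop_volts(1e-9, 50, 1e-6)
print(f"{v * 1000:.0f} mV")  # 50 mV -- significant against a ~1 V core
```

That margin is exactly the "slightly higher core voltage" tradeoff described above: the droop itself doesn't burn power, but the guard band you add to survive it does.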

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

kujeger posted:

then compare performance/watt, which has gotten a lot better?

I think the point is about raw day-to-day performance, a metric most enthusiasts care far more about than saving pennies on an electric bill.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Twerk from Home posted:

Performance / watt has enabled modern laptops to become what they are. It's pretty incredible, especially as gaming workloads have consistently remained 100% GPU limited.

Not arguing that; my T100 is great. But most enthusiasts do their heavy lifting on their towers, not on a more expensive, less capable laptop.

So again, that's not the metric enthusiasts particularly care for.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Skandranon posted:

Physics is a bitch. Until we can figure out an entirely new mode of computing, we'll have to focus on parallelizing our work instead of just making one really fast core. Cheap space travel would be nice too, but it turns out, it is actually REALLY HARD.

Not arguing against that truth either.

go3 posted:

enthusiasts are pretty irrelevant so i'm not particularly sure why Intel should be catering to them

Or that.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Grundulum posted:

Basic question: I see the term IPC thrown around a lot here. That means "instructions per cycle", right? So a 25% gain in IPC means that even if clock speeds are equal one chip is still faster than the other?

It means a processor can handle that many instructions per cycle.

"Powerful" is vague as poo poo in computer terms; a processor with a higher IPC but less complex or efficient instructions won't beat a processor with a bit less IPC but more powerful or efficient instructions.

Also, there's a sort of irony in dissing on the word "powerful" then using it immediately afterward :negative:
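The IPC-times-clock relationship is easy to sketch (same ISA and workload assumed throughout; the function is purely for illustration):

```python
# Throughput scales as IPC * clock when everything else is equal,
# so a 25% IPC gain at the same clock is a 25% faster chip -- but
# comparing IPC across different instruction sets is meaningless.
def relative_throughput(ipc, clock_ghz):
    """Instructions retired per nanosecond, roughly."""
    return ipc * clock_ghz

base = relative_throughput(1.0, 4.0)
improved = relative_throughput(1.25, 4.0)
print(improved / base)  # 1.25
```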

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Anime Schoolgirl posted:

:aaaaa: Who the gently caress thought that was a good idea for anything you can just plug in?

DMA is definitely not new or useless when properly implemented.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Nintendo Kid posted:

No they really don't. They're very robust designs. What the hell have you done with your microUSB ports?

Most of my friends and family have encountered this, and several of my cables just don't work without pressure despite decent treatment. Anecdotal, sure, but worrisome.

Usually it's the cable that goes bad, but a friend and my father just had to get new phones due to shoddy connections in the port.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Don Lapre posted:

Just use wireless charging and stop plugging poo poo in.

I would but I don't know if this hundred dollar sticker and charger will work in my otterbox.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
I think unironically that a phone running full Windows would not be a terrible device and, with a little tweaking, would be awesome.

My Windows Tablet is pretty sweet, and I already use a Note 3 which is practically a tablet I can put in my pocket. If my desktop fit in my pocket too...


PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Remember when trying to future proof a computer was a fool's errand?

What a weird time we live in.
