Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Tacier posted:

So I had a conversation with the IT guy at my office who buys only AMD processors and he seemed receptive to the idea that Intel might be better for our totally single threaded workloads, but still maintained that going AMD saves us money on RAM because Intel processors require you to use 3 sticks instead of two. I can't find anything to back up that claim, however.

Not only is this horseshit, but this is actually exactly the sort of thing you can use to turn management against him: how much do your GIS guys get paid per hour, and how many man-hours would legit faster computers save per year? If you have a bake-off and it shows the Intel system substantially outperforming the AMD system, you could immediately show your boss that this guy is costing the company thousands of dollars a year in lost productivity because he was (erroneously) worried about, what, forty bucks of RAM per computer?
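Back-of-the-envelope version of that argument (every number below is made up, plug in your own figures):

```python
# Hypothetical productivity math for the bake-off pitch -- all inputs
# are example values, not real data.
hourly_rate = 35.0        # loaded cost per GIS analyst, $/hour
analysts = 4              # people stuck on the slower machines
hours_saved_per_week = 2  # time not spent staring at a progress bar
weeks_per_year = 48

annual_savings = hourly_rate * analysts * hours_saved_per_week * weeks_per_year
ram_premium_per_box = 40.0
payback_years = (ram_premium_per_box * analysts) / annual_savings

print(f"lost productivity: ${annual_savings:,.0f}/year")
print(f"the extra RAM pays for itself in {payback_years * 365:.1f} days")
```

Even with numbers that conservative, the RAM "savings" evaporate in under a week of saved man-hours.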


Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

JnnyThndrs posted:

I like the LGA setup, personally, because removal of the HSF won't yank the drat chip out of the socket half the time.

Thanks for reminding me of the horror I felt when I managed to do just that with a Socket 939 chip years ago.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

mayodreams posted:

None explicitly, but OEM's didn't give a poo poo about onboard audio 4-5 years ago, so newer boards have better chipsets and design to minimize distortion. My ASUS P8Z68-V Gen3 has AWFUL onboard sound that gives me noise even through the SPDIF out. I had to buy an ASUS Xonar DG that works pretty well, but Windows 10 is having some challenges with it. It is also PCI and I would just like to have a newer board with better onboard audio and be done with it.

Also, onboard audio works better for Hackintoshes. Right now I have a TurtleBeach MICRO II USB DAC that I use for Mac OS X and that is kind of annoying.

Wait, how are you getting noise from SPDIF? Shouldn't that be all digital until it hits your receiver?

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

BangersInMyKnickers posted:

It's going to be a while before most things scale well beyond 4 threads, I'd say Intel still has an edge in the near term. Get the K variant so you have the option for overclocking and stretching it down the road.

In my opinion, by the time it's no longer adequate, enough time will have passed that an upgrade won't be a big deal. I ran a Core 2 Duo from January 2009 all the way to mid-2014, when I bought a non-K i5, and I fully expect to get at least another couple of years out of it. I'm happy to save a little money since I have no interest in overclocking.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
What's happening with higher resolutions that requires more CPU power in addition to more GPU power?

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

LRADIKAL posted:

Looking at the steam hardware survey, probably the "sweet spot" for relevance of gamers is between a GTX 970 and a 1070. Seems like a 1080 is a decent thing to benchmark as well. Past that, things start getting real "hardcore" niche. Adding to this, a vast majority of gaming takes place at resolutions at or below 1080p.


These cards make up 37 percent of the install base on steam (yes, China and poor countries, true.)

Hell, I've got a 1060 too. I can't believe I'm the only one who could afford a beefier video card but just refuses to pay over $300 for one out of stubbornness.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

suck my woke dick posted:

Yeah but those people are idiots.

That's why they're early adopters! :v:

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

K8.0 posted:

Intel can't produce enough high-end product and they are resorting to binning the hell out of everything to compensate. IDK if the KFs are just binned or if they're manufactured with a slightly different process that allows them to push things harder without as high of a failure rate but ultimately it's Intel coping with the problems brought on by betting so hard on 10nm and failing.

It may not just be high-end product: Dell and Lenovo have been slipping ship dates on regular desktops due to "part constraints", and what I've heard is that Intel has been late delivering CPUs.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
What's the intended use-case for that delightful oddity? In any form factor a home user would stick it into, the sheer size of the cooling apparatus needed means they'd probably have the space for a larger case and motherboard anyway.

It feels more like something for a custom rig: a rack full of those things fed by one big central water-cooling system or something.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

movax posted:

I think the demographics of this forum have mostly grown up past fanboy arguments of Intel vs AMD but there’ll always be a bit of crossover as was mentioned above.

Merging them is an interesting thought though...at least in the AMD thread recently, with the Ryzen launch, there’s been a ton of traffic talking about that platform, which would probably be difficult to intertwine with Intel chat also. Then again, we also have one GPU thread...

I feel like we’re in a weird valley where we need like 1.5 threads or something like that for Intel and AMD but open to hear your guys’ thoughts.

I sincerely think a combined x86 thread would be better and also sincerely want that thread to be immediately renamed to "x86: we literally cannot stop talking about Sandy Bridge" the instant someone starts yammering about how long they ran their 2500k or 2700k.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Methylethylaldehyde posted:

Didn't we do the whole 'merge the threads' thing to the IT bitching threads like 2 years ago, and it was met with a resounding 'ehhhhhhh' and went back to 2 threads fairly quickly?

Why is that, anyway? And aren't there three IT whining threads?

More poo poo that pisses you off: My boss said I don't scream enough
[SPAM] FW: RE: I entered your meeting details and ended up in a sex chat
Working in IT 3.0: The Scrum Dumpster

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
What's the thermal density (is that the right term? or maybe just watts/cm2?) on current chips compared to past generations?

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
Recently got a notification from our Dell reps that some orders might be delayed in the coming months due to shipping delays from Intel.


eames posted:

That's why they strongly suggest disabling Hyperthreading which makes these attacks significantly faster and easier. Obviously that's a major performance/efficiency loss depending on the workload.

Heh, I don't even have hyperthreading on my Haswell i5. :smuggo:

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Paul MaudDib posted:

Does B-die really hold any advantage over that new stuff (Micron E-die I believe)? The new stuff is a lot cheaper and seems to support some fast frequencies.

Okay, so I dropped out of following PC hardware for a while and I've been confused: what exactly is "b-die" and "e-die" and when did this concept emerge (or at least become relevant/important)?

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
How often does the "typical" enthusiast upgrade their CPU? Or I guess put another way, how many enthusiasts upgrade frequently enough that they can reuse the same motherboard?

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

sincx posted:

This isn't new by the way. ASRock did this decades ago.



That's Socket 939, right? God, it hurts to hear that referenced as "decades ago". :negative:

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
I still think the Intel and AMD threads should be collapsed into a single x86 thread.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Zedsdeadbaby posted:

No, you can't sit with us. Get off the 14nm process first loser

lol I'm still on 22nm myself, some of us don't compulsively upgrade our computers

my next one's probably gonna be AMD though

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Bofast posted:

Intel is still going to hurt a lot more in the server and laptop markets once the newest architecture actually gets put into Epyc and mobile Ryzen chips, and the more high end server sales they lose out on the less they can subsidize their lower end desktop chips.

There's a limit to how much they can hurt, though: TSMC and Samsung have finite fabrication capacity, and even if every datacenter customer and laptop OEM decided they wanted AMD, a bunch of them would have to (and may already) settle for Intel. And Intel is still selling chips as fast as they can make them; Dell is backlogged on laptop orders right now in part because they can't get enough chips from Intel.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Cygni posted:

The short is yeah, in the current implementation AMD uses, there are latency penalties to chiplet layouts.

Does that apply to the single-CCX chips as well?

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
Someone earlier in the thread said that the processor real estate taken up by instruction decoding is not that big:

FunOne posted:

No, that's not really an issue for modern processors. Space is taken up by cache and VLIW style execution units inside each core. If you take a look at a processor image with a block diagram over it you'll see that decode is a very small segment of the die.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Zedsdeadbaby posted:

We can only imagine if AMD decided to also juice the gently caress out of their CPUS, Intel may as well leave the game.

There's a finite amount of fab capacity in the world and Intel is still selling chips as fast as they can make them.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Twerk from Home posted:

Are there any desktop chips coming like the mobile -U chips that have more efficiency cores than performance cores? It's weird to me that most of the desktop lineup has no E-cores, and they're reserved for only the higher end parts at huge TDPs.

On the mobile side, I see a whole line-up of things like the Pentium 8505 and even lower end Celerons packing 5 cores now, and then by the time you're in the mid-range i5s the mobile parts have 10 cores, 2P + 8E. Why aren't we seeing mid or low priced desktop chips with a full host of E cores?

My guess is that Intel has finite fab capacity and is trying to ensure that they have enough of those -U chips to fulfill laptop OEM orders.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

evilweasel posted:

everyone copies jack welch because of how incredibly successful he was at GE at boosting stock prices by delivering consistent earnings

unfortunately everyone doesn't really realize that what made jack welch so successful at delivering consistent earnings was accounting fraud, which is why all of his acolytes crash and burn when they get their own companies

Did GE's pivot to being a financial services company happen during or after Welch's reign?

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

BobHoward posted:

Oh man there was a supermicro board in the lab once which did exactly this and it was infuriating

What I’ve run into a couple of times is that I’ve built systems with Supermicro motherboards and Noctua fans, and the Noctuas will spin so slowly that the fan speed falls below a critical threshold, causing the system to freak out and think one of the fans has failed entirely.

So it ramps up all the fans to full power, then goes back to normal because now the fans are no longer in a failure state… and the cycle begins again.

It’s generally possible to get in and adjust the thresholds, although I’ve also found that sometimes the fan spins slow enough that the board thinks it’s stopped entirely.
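For anyone else fighting this: the thresholds live in the BMC, and ipmitool can usually lower them so an idling Noctua stops tripping the alarm. Rough sketch below, assuming a locally reachable BMC and a sensor actually named FAN1 (sensor names and threshold granularity vary by board):

```shell
# List fan sensors and their current readings/thresholds
# (sensor names vary by board)
ipmitool sensor list | grep -i fan

# Lower the lower-bound thresholds for FAN1 so a slow-idling fan
# no longer registers as failed.
# Argument order is: lnr (non-recoverable) lcr (critical) lnc (non-critical)
ipmitool sensor thresh FAN1 lower 0 100 200
```

Supermicro BMCs tend to round the values you set to the fan's RPM step size, so check what actually stuck with another `sensor list` afterwards.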

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

HalloKitty posted:

240v circuits to home offices seems inevitable at this rate

when you say "home offices" are you talking about gamer dens? because i'm pretty sure the number of WFH positions that can't be serviced by a 15A 120V breaker is absolutely tiny
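for reference, the arithmetic behind that claim (using the usual NEC 80% derating for continuous loads):

```python
# What a standard US residential branch circuit can actually feed.
volts = 120
amps = 15
continuous_derate = 0.8  # NEC treats loads running 3+ hours as "continuous"

peak_watts = volts * amps                          # nameplate capacity
continuous_watts = peak_watts * continuous_derate  # safe sustained draw

print(f"{peak_watts} W peak, {continuous_watts:.0f} W continuous")
```

1440 W sustained covers even a monster gaming PC plus monitors with room to spare, which is why the "inevitable 240V" claim seems overblown to me.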

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
and even then this is still only impacting the upper, what, 1%? 0.1%? of all PC gamers

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Methylethylaldehyde posted:

Tell that to half my damned users who smuggle a space heater under their desk.

that doesn't sound like a home office problem


Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

Methylethylaldehyde posted:

You obviously don't have friends with a wife who is perpetually cold. My buddy's wife has four god damned space heaters under various desks and cubbies, specifically so her feet don't get cold when she's sitting there.

I kinda want a 20A/240 twistlock outlet for my computery poo poo, not because it needs it, but because twistlock is pretty great.

jesus christ how loving hot does that room get with four loving space heaters blasting :psyduck:



Canned Sunshine posted:

A lot of US tract home builders have multiple rooms (plus lighting) serviced by a single 15A breaker, so it's not quite that crazy to want 220/240v instead...

there's no way a tiny minority of people buying/building monster PCs are going to register in the minds of tract home builders looking to do the absolute minimum cheapest build

i'm not here trying to say "120V is superior to 240V" i just think it's ridiculous to think ultra top-end pc builds are going to make additional 240V residential circuits in the US "inevitable"
