priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

redeyes posted:

I early adopted an 8700K and it cost $300. Seems like I saved around 50 bux.

:same::hfive:

It's a nice CPU and I haven't even really messed around with an overclock other than the ASRock push-button tool.


craig588
Nov 19, 2005

by Nyc_Tattoo
I took a gamble and got a 5820K. It really paid off: apps I use are just starting to use 6 cores, and 5960Xs are getting cheap, so I'll get one of those once I start using apps that use 8 cores. Broadwells are still expensive, but they overclock poorly and end up costing like $200 more for the same performance. 5960Xs can be had for as little as $300 now.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Cygni posted:

Intel finished with a record year across the board, and projects 2019 to be another record. 4th quarter basically matched the record 3rd quarter ($18.7B vs $19.2B).

And they'll stay profitable if they can keep providing Apple with LTE and 5G modems so Apple can dump Qualcomm.

canyoneer
Sep 13, 2005


I only have canyoneyes for you
Qualcomm is gonna make all those sweet 5G infrastructure bucks because Huawei

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer

craig588 posted:

I took a gamble and got a 5820K. It really paid off: apps I use are just starting to use 6 cores, and 5960Xs are getting cheap, so I'll get one of those once I start using apps that use 8 cores. Broadwells are still expensive, but they overclock poorly and end up costing like $200 more for the same performance. 5960Xs can be had for as little as $300 now.

I'm in the same situation. I keep looking at 6900Ks and 5960Xs, but does Broadwell-E really OC worse? I also read that the Haswell-E-equivalent Xeons have unlocked multipliers, but that's just going off some forum post.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

track day bro! posted:

I'm in the same situation. I keep looking at 6900Ks and 5960Xs, but does Broadwell-E really OC worse? I also read that the Haswell-E-equivalent Xeons have unlocked multipliers, but that's just going off some forum post.

I very much recall people at the time saying Broadwell didn't overclock as well as Haswell.

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer

HalloKitty posted:

I very much recall people at the time saying Broadwell didn't overclock as well as Haswell.

That's a shame, I was hoping I could get a slightly less power-hungry CPU (lol at worrying about power usage while running a HEDT chip and a Vega 64) with a Broadwell-E one.

craig588
Nov 19, 2005

by Nyc_Tattoo
Silicon Lottery had a tray of 5960Xs and 6900Ks, and the 6900Ks overclocked much worse: https://siliconlottery.com/pages/statistics

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer

craig588 posted:

Silicon Lottery had a tray of 5960Xs and 6900Ks, and the 6900Ks overclocked much worse: https://siliconlottery.com/pages/statistics

Wow, only 35% are able to do 4.4GHz, which is what I have my 5820K clocked to at around 1.100 vcore.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.
Looks like the embargo lifted on the Xeon W-3175X (the unlocked 28-core CPU). Here's Gamers Nexus's review video:

https://www.youtube.com/watch?v=N29jTOjBZrw

And here's Der8auer delidding his review sample:

https://www.youtube.com/watch?v=aD9B-uu8At8

eames
May 9, 2009

The most amusing thing to me is that it seems to be pretty reasonable value if you have a use for all these cores: $3000 for 28 cores and 38.5MB of cache. The equivalent Xeon 8180 is >$10000.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Mirrors the 32C TR2 pricing, doesn't it? I wholly expect AMD to screw them with the 32C TR3.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.

eames posted:

The most amusing thing to me is that it seems to be pretty reasonable value if you have a use for all these cores: $3000 for 28 cores and 38.5MB of cache. The equivalent Xeon 8180 is >$10000.

It will be interesting to see how AMD responds to this. I bet we will see a 64-core Threadripper SKU, probably for the same price as the Xeon (or maybe slightly higher, since AMD can argue the overall system cost will still be less due to socket TR4 motherboards being much cheaper). Since we can probably expect it to hit the same clocks as the Xeon part (at least stock vs. stock), and we know Zen 2 has full AVX2 support, that would really only leave Intel with an advantage in memory channels for the Xeon.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Combat Pretzel posted:

Mirrors the 32C TR2 pricing, doesn't it? I wholly expect AMD to screw them with the 32C TR3.

2990WX is ~$1700.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Combat Pretzel posted:

Mirrors the 32C TR2 pricing, doesn't it? I wholly expect AMD to screw them with the 32C TR3.

Steve also mentioned in the Gamers Nexus video that the Asus W-3175X motherboard is estimated to cost around $1700, with retail availability currently unknown.

sauer kraut
Oct 2, 2004
GN has gotten off track lately with crap like ^^that, LN cooling, and collaborating with Roman.
I mean, yeah, he needs to put out a video just about every day to put food on the table, but c'mon.

BurritoJustice
Oct 9, 2012

sauer kraut posted:

GN has gotten off track lately with crap like ^^that, LN cooling, and collaborating with Roman.
I mean, yeah, he needs to put out a video just about every day to put food on the table, but c'mon.

What?

Do you just not like extreme OC, or do you think it's poor-quality content? Is this partially the weird hateboner against Der8auer this thread has had in the past?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
GN/Steve's doing what he's doing because, like it or not, clickbait pays the bills. That's why they're stoking these bullshit, wholly-over-the-top ~benchmark rivalries~ between the tech streamers all of a sudden.

If anything, it leaves the "educating the masses about computer tech" field open for newer blood.

Cygni
Nov 12, 2005

raring to post

What else is he gonna do at the moment, review H310 boards? He’s gotta fill the space between major releases with something.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Good thing I know Chinese and can watch better-quality reviews from China, made on much smaller budgets than the big-time YouTubers have.

sauer kraut
Oct 2, 2004

Cygni posted:

What else is he gonna do at the moment, review H310 boards? He’s gotta fill the space between major releases with something.

Stay on top of recent GPU driver releases and their issues, the best gaming mice/keyboards/headsets, sexy cases, in-depth OC/undervolting guides for different mobo vendors.
Not putting a 28-core Xeon under a chilled waterblock while calling yourself a gaming channel; the Canadian Clown does that kind of stuff better.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
If you want some solid content with unbelievable effort put into benchmarking, Hardware Unboxed with Aussie Steve is always good.

orcane
Jun 13, 2012

Fun Shoe

sauer kraut posted:

Stay on top of recent GPU driver releases and their issues, the best gaming mice/keyboards/headsets, sexy cases, in-depth OC/undervolting guides for different mobo vendors.
Not putting a 28-core Xeon under a chilled waterblock while calling yourself a gaming channel; the Canadian Clown does that kind of stuff better.

:goonsay:

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

HalloKitty posted:

If you want some solid content with unbelievable effort put into benchmarking, Hardware Unboxed with Aussie Steve is always good.

Hardware Unboxed has a criminally low number of subscribers for its quality, it's crazy

At one point the dude did ~300 benchmarks for a single video

edit: 342 benchmark runs for this particular video. He was banned by EA at one point because they detected so many hardware changes

Zedsdeadbaby fucked around with this message at 13:18 on Jan 31, 2019

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Zedsdeadbaby posted:

he was banned by EA at one point because they detected so many hardware changes

Actually Hardcore Benchmarking

Cygni
Nov 12, 2005

raring to post

sauer kraut posted:

Stay on top of recent GPU driver releases and their issues, the best gaming mice/keyboards/headsets, sexy cases, in-depth OC/undervolting guides for different mobo vendors.
Not putting a 28-core Xeon under a chilled waterblock while calling yourself a gaming channel; the Canadian Clown does that kind of stuff better.

I would much rather watch the Xeon thing than a driver errata showcase or BIOS simulators, but to each their own I guess.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

HalloKitty posted:

If you want some solid content with unbelievable effort put into benchmarking, Hardware Unboxed with Aussie Steve is always good.

Yeah, on the subject of benchmarking effort, this vid they did a few weeks back is very useful.

https://www.youtube.com/watch?v=nE_zW5SKPic&t=5s

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

sauer kraut posted:

GN has gotten off track lately with crap like ^^that, LN cooling and collaborating with Roman.
I mean yeah he needs to do a video about every day to put food on the table, but c'mon.

Hello.

Did you know you don't have to watch every video on a channel? It's possible to only click on the ones you're interested in.


Well, goodbye.

Cygni
Nov 12, 2005

raring to post

Itanium has officially been dead for a while now, but Intel has released the phase-out plan. No further development, and the last parts will ship to HPE no later than summer 2021.

https://www.anandtech.com/show/13924/intel-to-discontinue-itanium-9700-kittson-processor-the-last-itaniums

It is kinda sad seeing non-embedded VLIW die like this. I remember Itanium being announced with magazine articles talking about how VLIW would avenge the i860 and iAPX 432 and revolutionize computing forever. Not so much.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull
I'm not sad to see it go. The Itanium hype was always based on Intel management and marketing huffing the farts of the Itanium team and refusing to listen to internal warnings from other departments that Itanium was likely to fall well short of its promises. (Pages 85-92 are what you want to read.)

Stanley Pain
Jun 16, 2001

by Fluffdaddy

quote:

Anyway this chip architect guy is standing up in front of this group promising the moon and stars. And I finally put my hand up and said I just could not see how you're proposing to get to those kind of performance levels. And he said well we've got a simulation, and I thought Ah, ok. That shut me up for a little bit, but then something occurred to me and I interrupted him again. I said, wait I am sorry to derail this meeting. But how would you use a simulator if you don't have a compiler? He said, well that's true we don't have a compiler yet, so I hand assembled my simulations. I asked "How did you do thousands of lines of code that way?" He said "No, I did 30 lines of code". Flabbergasted, I said, "You're predicting the entire future of this architecture on 30 lines of hand generated code?" [chuckle], I said it just like that, I did not mean to be insulting but I was just thunderstruck. Andy Grove piped up and said "we are not here right now to reconsider the future of this effort, so let's move on". I said "Okay, it's your money, if that's what you want."

Cygni
Nov 12, 2005

raring to post

It's less caring about Itanium itself or Intel or its team or whatever and more caring about VLIW as a concept, to me. The concept seemed so promising, with EPIC and TeraScale and all that, and now it's all pretty much gone except in embedded coprocs and stuff. Sad.

movax
Aug 30, 2008


:staredog:

I was so hyped on Itanium back in the day. Morbidly curious what workloads the folks buying Itanium boxes from HPE are running.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

movax posted:

:staredog:

I was so hyped on Itanium back in the day. Morbidly curious what workloads the folks buying Itanium boxes from HPE are running.

Some big-iron financial software system/database thing that's still too expensive to backport to x86 after the millions spent getting it working in the shiny new Itanium environment.

KKKLIP ART
Sep 3, 2004

So for those who didn't ever really follow it, what was the big hype around Itanium to begin with, and how did it fall flat on its face?

redeyes
Sep 14, 2002

by Fluffdaddy
It was something like: with very long instruction words, the compiler could do magical things. It turned out compilers were pretty dumb, and the performance gains were never realized.

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

KKKLIP ART posted:

So for those who didn't ever really follow it, what was the big hype around Itanium to begin with, and how did it fall flat on its face?

It was supposed to have advanced features to make the pipeline more efficient and cut legacy garbage, but it turned out to not actually be more efficient for realistic workloads, so why bother using it outside of special-purpose applications?

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Most CPUs used nowadays extract parallelism from code in hardware at run time. Itanium used what is called an EPIC (explicitly parallel instruction computing) architecture: instead of the hardware extracting parallelism from serial code, the code was explicitly parallel, in that one instruction word would have multiple instructions encoded in it. This put the onus on the compiler to extract the parallelism, and it turns out that's really loving hard when running general-purpose code. The concept works better with specialized things like graphics, which are much easier to parallelize.
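
To make that concrete, here's a quick hypothetical C sketch (mine, not anything from the thread; the variable names are made up, though the three-slot bundle is a real IA-64 detail). Independent operations can be statically packed into one wide word, while a dependent chain leaves the slots empty:

code:

/* Hypothetical illustration of why EPIC/VLIW leans on the compiler.
 * On Itanium, instructions came in 128-bit bundles of three slots;
 * the compiler had to find independent work to fill every slot. */
#include <stdio.h>

int main(void) {
    int a = 1, b = 2, c = 3, d = 4;

    /* Independent operations: a static compiler can pack these three
     * into one bundle, since none depends on another's result. */
    int w = a + b;   /* slot 0 */
    int x = c + d;   /* slot 1 */
    int y = a * c;   /* slot 2 */

    /* Dependent chain: each line needs the previous result, so only
     * one useful instruction issues per cycle; the remaining slots
     * get padded with no-ops and the wide hardware sits idle. */
    int z = w + x;
    z = z * y;
    z = z + w;

    printf("%d\n", z);
    return 0;
}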

repiv
Aug 13, 2009

MaxxBot posted:

The concept works better with specialized things like graphics which are much easier to parallelize.

Hasn't VLIW been abandoned in graphics too though? Nvidia dropped it ages ago, AMD dropped it with GCN, maybe some weird mobile GPUs still use it.


crazypenguin
Mar 9, 2005
nothing witty here, move along

MaxxBot posted:

This put the onus on the compiler to extract parallelism and it turns out that's really loving hard when running general purpose code.

Just to expand on this remarkably succinct explanation (nice job): general-purpose code branches a lot.

When it's the compiler's job to create big honking mega-instructions that explain how to use the processor's resources to their fullest potential, it needs to know what's being computed to do that. You can't cram more compute in when you don't know what else needs to happen. When there's a branch that can go two different ways, it basically has to throw up its hands and go "idk sry." It can't really predict how the code will run. (And god help you if there are more than 2 different ways it can go!)

The CPU can do that dynamically just fine with speculation though. (Like, even considering recent security problems.) So modern CPUs just do this scheduling of instructions onto ALUs dynamically while running the code, mostly unimpeded by branching.

This whole failure is sometimes blamed on insufficiently smart compilers, and that's sort of true, but the truth is they designed an architecture that's bad at branching, wanted to run branch-heavy code on it, and said "this problem is left to the compiler devs lol."
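
For a concrete picture, here's another hypothetical C sketch (again mine, not from the thread). The branch direction depends on the data, so a static scheduler can't know ahead of time which side to bundle work from, while an out-of-order CPU just predicts it at run time and keeps going:

code:

/* Hypothetical sketch of the branching problem for static scheduling. */
#include <stdio.h>

int sum_magnitudes(const int *v, int n) {
    int total = 0;
    for (int i = 0; i < n; i++) {
        /* Data-dependent branch: the compiler can't know which way
         * this goes, so it can't pre-pack both paths into one wide
         * instruction word. Itanium's answer was predication
         * (compute both sides, keep one), which works for a tiny
         * diamond like this but blows up as the paths multiply. */
        if (v[i] >= 0)
            total += v[i];
        else
            total -= v[i];
    }
    return total;
}

int main(void) {
    int v[] = {3, -1, 4, -1, 5, -9, 2, 6};
    printf("%d\n", sum_magnitudes(v, 8));  /* prints 31 */
    return 0;
}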
