K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Gamers were a mistake.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

FuturePastNow posted:

I'm just imagining the little stepper motor seizing up and catching on fire someday.

It isn't likely to mechanically seize, but fun stepper motor fact: the motor drive circuit can choose to electromagnetically hold the spindle still, i.e. it resists spinning with great torque. It is in this state that the motor draws peak power, because it's just max DC current flowing through all the windings. In other words, lovely software can increase the likelihood of fire by commanding full-power stop.

(Unfortunately for our internal Beavis and Butthead, it's likely they cheaped out on the drive electronics and there's nothing which can supply enough amps to light anything on fire)
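For a sense of scale, here's a back-of-the-envelope Joule-heating estimate in Python. The winding resistance and hold current below are assumed, NEMA-17-ish values for a small stepper, not anything from ASRock's spec sheet.

code:
# Power dissipated while the driver holds the spindle at full current:
# plain DC through the windings, so P = I^2 * R per energized phase.
HOLD_CURRENT_A = 1.0   # per-phase hold current set by the driver (assumed)
WINDING_OHMS   = 2.5   # per-phase winding resistance (assumed)
PHASES_ON      = 2     # both phases energized during a full-power hold

per_phase_w = HOLD_CURRENT_A ** 2 * WINDING_OHMS
total_w     = per_phase_w * PHASES_ON
print(f"{per_phase_w:.1f} W per phase, {total_w:.1f} W total while holding")
# ~2.5 W per phase, ~5 W total: hand-warm at worst, nowhere near ignition
# unless the drive electronics can push far more current than this.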

KYOON GRIFFEY JR
Apr 12, 2010



Runner-up, TRP Sack Race 2021/22
someone in the anandtech thread suggested that they power it with waste heat and how has nobody done this yet

GRINDCORE MEGGIDO
Feb 28, 1985


You can watch the cog spin as you enjoy the nerfed network port, I guess (vs. the Z490 Taichi)

wemgo
Feb 15, 2007
Rainbow LED pcs are out. Steampunk pcs are in.

18 months until someone links a reddit post of a PC that transfers power from the psu to the cpu and mobo with leather belt drives.

WhyteRyce
Dec 30, 2001

Icelake SP reviews dropped and all you guys can talk about is spinny poo poo.

https://www.anandtech.com/show/16594/intel-3rd-gen-xeon-scalable-review

I guess it isn’t that great so go back to talking about putting a grandfather clock in a case

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

WhyteRyce posted:

Icelake SP reviews dropped and all you guys can talk about is spinny poo poo.

https://www.anandtech.com/show/16594/intel-3rd-gen-xeon-scalable-review

I guess it isn’t that great so go back to talking about putting a grandfather clock in a case

I'm sad now because this is how Optane ends, not with a bang but with a whimper

WhyteRyce
Dec 30, 2001

If Intel cares about Optane as anything other than a way to lock people into buying Xeons then they should jump immediately onto CXL. Under BS and BK that was never going to happen but maybe Pat will have a different view

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

movax posted:

Legit would consider a windowed case to show that off. It's dumb but I love it.

I would rather see more/better gimmick cases

Wild EEPROM
Jul 29, 2011


oh, my, god. Becky, look at her bitrate.
how many phases is the vrm that powers the asrock spinny gear

WhyteRyce
Dec 30, 2001

Also if you are interested in optane here is a quick review on the 200 series on Ice Lake

https://www.storagereview.com/review/intel-optane-persistent-memory-200-series-review-memverge

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Simplifying this all down... Is Intel Optane ever going to be a reasonable choice for consumers in the near future, or is this just an enterprise play, or for folks who are willing to shell out thousands for super high performance systems?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Crosby B. Alfred posted:

Simplifying this all down... Is Intel Optane ever going to be a reasonable choice for consumers in the near future, or is this just an enterprise play, or for folks who are willing to shell out thousands for super high performance systems?

In the near future? No, not unless Intel decides to do some big discontinuation blow-out sale.

WhyteRyce
Dec 30, 2001

It’s purely enterprise and even then it’s not for everyone. The SSDs are so expensive compared to good-enough NAND except for specific cases like using an Optane drive for logs or caching. And for the DIMMs you’re locking yourself into Xeon chips, and probably the highest-end SKUs of them, knowing Intel. You’d have to need shitloads of memory, or maybe have giant in-memory databases or something, to justify it.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

Yeah it’s a very thin market between large DIMMs and fast NVMe. There are some cool things you could theoretically build with persistence, but I’ve only seen one really impressive demo, and you could get pretty close with a battery-backed DIMM.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Crosby B. Alfred posted:

Simplifying this all down... Is Intel Optane ever going to be a reasonable choice for consumers in the near future, or is this just an enterprise play, or for folks who are willing to shell out thousands for super high performance systems?

At this point Intel has officially killed all consumer-facing Optane products. They still make server PCIe drives and the DIMM NVM sticks, but everything consumer-facing (the 900P/905P and M.2 sticks) is discontinued.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

WhyteRyce posted:

Icelake SP reviews dropped and all you guys can talk about is spinny poo poo.

https://www.anandtech.com/show/16594/intel-3rd-gen-xeon-scalable-review

I guess it isn’t that great so go back to talking about putting a grandfather clock in a case

I mean, we’re all mostly interested in the consumer market. These won’t even trickle down to surplus server rigs for another couple years, maybe some of us will get to play with them at work but that’s pretty boring / utilitarian. And even in the work sense, Epyc is just a much more exciting product.

Ice Lake would have been more relevant if it had launched two years ago. It still would have lost to Rome, but it gets dunked on by Milan. It’s a solid iteration architecturally, but it’s of course nowhere near the kind of improvement you’d need to overcome AMD’s painless scaling across chiplets, which leaves AMD with a core count, cache, and cost advantage. And worst of all it’s not even on the new 10SF node; it’s still using the lovely second-gen 10nm, so it clocks like trash and that offsets a lot of the IPC gains. Once again, maybe relevant in 2019, but in 2021 Intel should really be offering a Tiger Lake-style SuperFin update.

I was gonna list "and no HEDT variant" as another reason consumers wouldn’t be interested, but... with those clock regressions, would anyone want it in a HEDT rig anyway? Increasingly that’s just a segment where Intel doesn’t even bother to compete: even with some aggressively mediocre prices (at times Epyc has literally been 10-20% cheaper than Threadripper), AMD pretty much dunks on Intel in the value camp, let alone the high end. There are a few niches AMD doesn’t cover well, but they’re narrow gaps. I’d like to see Intel do something with the W-3175X, since that’s the only "serious" competitor they have to Threadripper, but even if you marked the chip down to something attractive (say $1500) the motherboards are still incredibly ridiculous. Some of that is small production volume, of course, but to get it into the mass market you’d have to bring the mobo prices down by at least half even if the chip itself was reasonably priced.

The one advantage Intel does have is that a monolithic die still reduces power usage. The IO die is pulling 40% of the power in Milan, and factoring in the power the CCDs spend pushing their side of the data, it’s entirely possible that less than half the energy of the processor goes to actual computation (cores/cache). Of course it’s a somewhat smaller platform too, and a monolithic die also comes with a penalty to how much cache you can reasonably stuff onto it, and up to a point cache reduces power utilization (it’s cheaper to go to cache than out to memory, especially with the IO-die configuration), as long as the added cache power doesn’t outweigh the improvement in hit rates. But regardless, in practice Intel is spending way less power on uncore; that much is obvious from the Anandtech numbers.

That seems to be the fundamental limitation of chiplet designs. Every time a new limit gets found in processor performance scaling, there’s a period of stagnation, a new way is devised to get around the limitation, and performance scales up until some new bottleneck appears (often imposed by compromises from the solution to the previous one). The limitation imposed by chiplets is the increase in power from having to communicate off-die: more power gets spent moving data around, and eventually a new equilibrium is reached. And IO doesn’t really scale with node shrinks. You can reduce the core/cache power itself, but as you can see from Milan, that’s where less than half the power is actually spent, so you need big improvements to make small total gains since you’re only optimizing a small part of the power budget (sort of an Amdahl’s Law situation). Even if you took the core power to literally zero you couldn’t do better than doubling perf/watt without further optimizations to that IO power cost. Even if you’re willing to go arbitrarily big on socket size and so on, chiplets don’t allow magic indefinite scaling; like every Moore’s Law-style crisis, it’s just a temporary reprieve until the next bottleneck rears its head.
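To put rough numbers on that Amdahl-style ceiling, here's a toy model in Python. The 50/50 core/uncore split is purely illustrative (the post above argues cores may be even less than half of Milan's package power); nothing here is measured data.

code:
# Toy model: if only the core/cache share of package power improves and
# the off-die IO/uncore share stays fixed, perf/watt gains saturate.
def perf_per_watt_gain(core_fraction, core_power_scale):
    # core_power_scale of 0.5 means the cores now burn half the power
    # for the same performance; 0.0 means hypothetical free cores.
    new_total = core_fraction * core_power_scale + (1.0 - core_fraction)
    return 1.0 / new_total

for scale in (0.7, 0.5, 0.25, 0.0):
    print(f"core power x{scale}: {perf_per_watt_gain(0.5, scale):.2f}x perf/watt")
# With a 50% uncore share, even core power at literally zero tops out
# at a 2.00x perf/watt gain until the IO power itself is optimized.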

Paul MaudDib fucked around with this message at 10:02 on Apr 7, 2021

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
How about Intel chips seeing further price cuts?

repiv
Aug 13, 2009

https://twitter.com/beesmygod_/status/1379940286927937540

https://twitter.com/dril/status/841892608788041732

Cygni
Nov 12, 2005

raring to post

this cannot be real

it cannot be

e: https://devmesh.intel.com/projects/bleep

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
Pretty much the best technology already exists: it’s called a private Discord server with only you and your friends.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Cygni posted:

this cannot be real

it cannot be

e: https://devmesh.intel.com/projects/bleep

PCMag posted:

https://www.pcmag.com/news/intel-bleep-software-filters-out-toxic-slurs-in-voice-chats-as-you-game

Intel originally debuted the concept for Bleep two years ago...Intel says it's preparing to release the program as a beta.

I had to confirm that the article was posted back in March before I was willing to concede this wasn't an April Fool's joke I'd just missed.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

priznat posted:

Pretty much the best technology already exists: it’s called a private Discord server with only you and your friends.

Warzone has a death mic feature where as soon as you kill somebody, their mic is activated and you get to hear what they say. It's a great feature since you get to hear that sweet sweet rage, or even just somebody saying "no way what a shot" or something. Plenty of times where it's just f**got or n**ger type of poo poo though.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

fletcher posted:

Warzone has a death mic feature where as soon as you kill somebody, their mic is activated and you get to hear what they say. It's a great feature since you get to hear that sweet sweet rage, or even just somebody saying "no way what a shot" or something. Plenty of times where it's just f**got or n**ger type of poo poo though.

Yeah that was cool. I could never get it going for me, because I think I had my voice chat locked down too tightly... or perhaps it was that I just sucked and rarely got kills :negative:

repiv
Aug 13, 2009

imagining a bunch of intel interns nervously mumbling the hard-r to build up the test corpus

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Imagine the writeup that someone had to do and send to HR.

"We would like approval to record and store and unlimited number of voice recordings of people saying the following words..."

And then sit there in a meeting and insist with a straight face that it isn't a joke while some poor 60-year-old HR dude/lady sits there reading the list out for everyone.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I don't really know much of anything as to how this works but does AVX-512 support on Rocket Lake matter at all for CPU-based encoding for streaming purposes?

EDIT:

Also are there really no H510 motherboards yet?

gradenko_2000 fucked around with this message at 05:06 on Apr 8, 2021

FuturePastNow
May 19, 2014


Cygni posted:

this cannot be real

it cannot be

e: https://devmesh.intel.com/projects/bleep

Everyone who works for Intel should be fired

From the CEO on down to the receptionist

Start the company over from scratch

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Seems like a good use of AI/ML, better than 99% of the poo poo we see. It must introduce a huge delay though, since it needs to work on whole words; I wonder how it works.

Fame Douglas
Nov 20, 2013

by Fluffdaddy
I really want to play with these racists, but can't stand to hear the actual words!

What is the target market for this? Seems entirely useless.

BlankSystemDaemon
Mar 13, 2009



The thing about any non-volatile memory technology is that there's simply not enough write endurance on it yet to replace DRAM.

Once you do replace DRAM you have to rewrite the entire operating system, because at that point there's no distinction between main memory and auxiliary memory.

Installation of the OS becomes a question of net-booting exactly once, and if you want to install software, you simply download it.
Naturally, this is not that much different from how Steam and things like Chocolatey work now, but you effectively remove the need for Windows MSI installers, CAB files, and everything else.
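As a loose illustration of what "no distinction between main and auxiliary memory" looks like to software, here's a sketch that uses an ordinary memory-mapped file as a stand-in for byte-addressable persistent memory. It shows the programming model only; real NVDIMM-aware code would go through something like a DAX filesystem or a pmem library instead.

code:
# A counter that lives "in memory" yet survives restarts, with mmap on a
# plain file standing in for persistent RAM.
import mmap, os, struct

PATH = "counter.bin"   # hypothetical backing file
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(struct.pack("<Q", 0))   # one 8-byte counter, starts at zero

with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), 8)
    (count,) = struct.unpack("<Q", mem[:8])
    mem[:8] = struct.pack("<Q", count + 1)   # an ordinary in-memory store
    mem.flush()   # with real persistent memory this is roughly a cache flush
    mem.close()
    print(f"counter has survived {count + 1} run(s)")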

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

gradenko_2000 posted:

I don't really know much of anything as to how this works but does AVX-512 support on Rocket Lake matter at all for CPU-based encoding for streaming purposes?

potentially yes, but most streaming sites don't ingest video in anything except H264, and x264's speedups from AVX-512 aren't particularly great. how good it's gonna be depends on the exact motion-search settings - testing at the 1080p fast preset (let alone 720p fast) is :chloe: in a world where NVIDIA is beating the quality of x264's medium preset with a hardware encoder. the higher the resolution and the higher-quality the preset, the more there is gonna be for AVX to sink its teeth into.

the speedups are much better in x265 but streaming sites don't use that.

(when comparing results bear in mind that Handbrake 1.0.x didn't support AVX-512 at all - they only updated their encoder libraries in 1.1.0, so a lot of the early Skylake-X benchmarks aren't really valid anymore. Actual x264/x265 encoder benchmarks (run from the application directly) are still valid, but not ones run through those older Handbrake versions.)

maybe x264 will be something that gets worked on more now that there's an AVX-512 mainstream platform in the wild - do bear in mind that up until now, you had clock regressions when using AVX-512, which would be problematic for games (and not good for overall performance in general), and it was only available on a handful of platforms. I wouldn't make purchasing decisions based on it, but maybe in 5 years the performance advantage will be somewhat higher once people have more of a chance to tune it.

in general though NVIDIA has pretty much solved the streaming problem. You have to use such a large amount of dedicated CPU power to beat Turing/Ampere's hardware encoder that it's just impractical to do in realtime while you're also running a game - they are beating "medium" quality so you need to be running slow, slower, or veryslow to even bother trying. I'd say you need at least 16 cores (8 for the game and 8 for encoding, or a dedicated encoding rig with 8+ cores) before it's even worth thinking about it.

edit: for a sense of perspective I ran a quick test transcoding the first couple minutes of The Wire S01E01 1080p (so, 30fps live-action video): I'm getting 15fps at the veryslow preset (you'll want to significantly beat the hardware encoder). Similar for some 60fps recordings I did of BF1 gameplay (rescaled down to 1080p): about 15fps. So at the veryslow preset you're running at 1/2 of realtime for 1080p30, or 1/4 of realtime for 1080p60, on a 9900K with 8 cores, at about 80-85% CPU utilization. At the slow preset (the very easiest thing that can beat the hardware encoder by any margin) I'm getting about 53 fps at 80% utilization, so you could do 1080p30 streaming with about 6 cores worth of utilization at minimum settings. You'd want 12 dedicated cores for encoding 1080p60. Add your game on top of that CPU utilization.
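Plugging those numbers into a trivial check in Python (the figures are only the ones from this quick test on that 9900K, not general benchmarks; other encoder builds and settings will move them around):

code:
# Realtime factor = encoder throughput / content frame rate.
def realtime_factor(encode_fps, content_fps):
    return encode_fps / content_fps

print(realtime_factor(15, 30))   # veryslow, 1080p30 source  -> 0.50x realtime
print(realtime_factor(15, 60))   # veryslow, 1080p60 capture -> 0.25x realtime
print(realtime_factor(53, 30))   # slow preset, 1080p30      -> ~1.77x, fast enough to stream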

Software video encoding is still worth it for archival-grade transcodes, where you want to squish it down once, with minimal artifacts, and store it in as high a quality as possible, as small as possible. For streaming it's just not worth the squeeze when you can get x264 medium-like quality basically for free. (Radeon owners need not apply.)

example:
code:
ffmpeg -i Battlefield\ 1\ 2021.04.01\ -\ 22.41.40.01.mp4 -filter:v scale=1920:1080 -c:v libx264 -preset veryslow -crf 27 -c:a copy bf1-test.mkv

Paul MaudDib fucked around with this message at 13:03 on Apr 8, 2021

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I'm not gonna pretend to understand all of that, but from what I can tell, what you're saying is:

* most sites, like say youtube and Twitch, are still using fairly old codecs (H264 / x264) that don't really benefit from AVX512
* a codec that does benefit well from AVX512, such as x265, hasn't really gotten wide adoption yet for streaming purposes
* Nvidia's NVENC is so good that in order to match it, you'd need to have a LOT of CPU power (as you said, 8 cores or more just for encoding, plus the game)

and so in conclusion it's not really useful to consider the value-add of Rocket Lake's AVX512 support in terms of using the CPU for encoding for streaming, because the most economical answer is still to use NVENC

am I reading you correctly?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

gradenko_2000 posted:

I'm not gonna pretend to understand all of that, but from what I can tell, what you're saying is:

* most sites, like say youtube and Twitch, are still using fairly old codecs (H264 / x264) that don't really benefit from AVX512
* a codec that does benefit well from AVX512, such as x265, hasn't really gotten wide adoption yet for streaming purposes
* Nvidia's NVENC is so good that in order to match it, you'd need to have a LOT of CPU power (as you said, 8 cores or more just for encoding, plus the game)

and so in conclusion it's not really useful to consider the value-add of Rocket Lake's AVX512 support in terms of using the CPU for encoding for streaming, because the most economical answer is still to use NVENC

am I reading you correctly?

correct, yup.

(terminology: x264 is a software encoder for the H264/MPEG-4 AVC codec, x265 is a software encoder for the H265/HEVC codec; both codecs have hardware encoders available as well. And "fast", "medium", "slow", "veryslow", etc are presets that control "how hard" the encoder tries to find an optimal encoding for the video - "slower" presets are better quality because the encoder searches deeper into different ways it could encode the video to find the best one.)

H265 unfortunately is pretty much dead in a commercial sense due to patent encumbrance. Kind of unfortunate, because AMD actually does have a really good hardware H265 encoder, but there's very little use for "realtime" H265. "Realtime" usages (videoconferencing, streaming, etc) went H264; "archival" usages (like YouTube, etc) went either VP9 or are planning to move right on to AV1, which is the "next gen" archival codec, even slower with really high compression ratios. The big use-case for H265 is ironically storage for home Plex servers, where it achieves really nice compression ratios at good quality, but that's the kind of thing where you don't use a hardware encoder - you use a CPU encoder and squish it down once at the maximum possible quality for your target bitrate - and AVX-512 does help there.

Maybe AV1 will benefit too; the encoding is really intensive there, like "seconds per frame" intensive (the point is for YouTube to encode it once and then spend less bandwidth every time someone views it; over millions of views that reduces their bandwidth costs enough to make a really slow one-time encode worth it). That's the kind of thing where AVX might be able to get its teeth in and provide some good speedups. People are working on hardware encoders, but I don't think there are any out yet, just decoders (on Rocket Lake and Ampere). But it won't be for streaming.

anyway, yes, NVENC H264 really is phenomenal. It's still marginally worth it to have a dedicated encoding rig if you're a pro streamer who makes their living from it, since that way you remove absolutely any performance hit from the encoding, but for most people it's just not worth the 6-12 core performance hit to beat it. NVENC is not just good enough but actually good. AMD's H265 encoder is great too - they apparently made that for Google Stadia, which decided to pay the licensing costs to improve the ability to stream to more people on limited connections, on wifi, etc, but lol google, it's dead now.

Paul MaudDib fucked around with this message at 14:07 on Apr 8, 2021

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Fame Douglas posted:

I really want to play with these racists, but can't stand to hear the actual words!

What is the target market for this? Seems entirely useless.

Executives who want their rants bleeped out before they can be recorded by zoom.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Fame Douglas posted:

I really want to play with these racists, but can't stand to hear the actual words!

What is the target market for this? Seems entirely useless.

I'd assume this is less "me the viewer doesn't want to hear the bad words" and a lot more "me the streamer doesn't want to have to actually change any of my toxic behavior / vocabulary but also would prefer to not risk getting banned / demonetized on a platform for dropping racial slurs every 5 seconds."

And yeah, I'd assume that using it would introduce a few seconds delay to allow for processing, but that's fine for the vast majority of streamers as long as the system still keeps video/audio sync.
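To make the delay point concrete, here's a toy sketch of the buffering idea in Python: audio frames are held back long enough for a classifier to judge the whole word before the oldest frame is released. Everything here (frame size, window length, the flagged() stand-in) is hypothetical and has nothing to do with how Bleep is actually implemented.

code:
from collections import deque

FRAME_MS     = 20    # one chunk of audio (assumed)
DELAY_FRAMES = 50    # ~1 second of held-back audio (assumed)

def flagged(window):
    """Stand-in for whatever model decides the buffered word should be bleeped."""
    return False

buffer = deque()

def process(frame, send):
    # Hold each frame until DELAY_FRAMES newer ones have arrived, so the
    # classifier sees a whole word before the oldest frame goes out.
    buffer.append(frame)
    if len(buffer) <= DELAY_FRAMES:
        return
    oldest = buffer.popleft()
    send(b"\x00" * len(oldest) if flagged(buffer) else oldest)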

wet_goods
Jun 21, 2004

I'M BAAD!

mobby_6kl posted:

Seems like a good use of AI/ML, better than 99% of the poo poo we see. It must introduce a huge delay though, since it needs to work on whole words; I wonder how it works.

All of the gamer words are collected ghostbusters style to be accidentally released in a catastrophic storm of hate in a few years.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
my Z490 mobo has 2 display outputs, HDMI and DP. I'm running a 10700K. Can I power 2 monitors at once? Assuming yes, but I don't deal with onboard graphics enough to know

Fame Douglas
Nov 20, 2013

by Fluffdaddy

Fauxtool posted:

my Z490 mobo has 2 display outputs, HDMI and DP. I'm running a 10700K. Can I power 2 monitors at once? Assuming yes, but I don't deal with onboard graphics enough to know

It can.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

extremely good news. Not having a gpu might end up being fairly tolerable.

I think a reasonable use for Bleep would be enabling it in games without telling anyone. You are free to be a racist bigot spewing hate, but no one else can hear you and you are none the wiser. You just think people are ignoring you except for gameplay-vital comms.

Fauxtool fucked around with this message at 23:55 on Apr 8, 2021
