Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

SlayVus posted:

I just tried their CorePrio program on my 1950X and was running Indigo myself. Without CorePrio and without /affinity, I was getting 1.6 on Indigo. With just /affinity 0xFFFFFFFE I was getting 1.73 in Indigo (launching Indigo with Core 0 affinity off). With CorePrio and no /affinity, I shot up to 2.2 in Indigo with my 1950X OC'd to 3.9GHz. So not the same 50% increase in performance he was seeing, but I did see a ~37% increase in performance in just Indigo.
You running your 1950X in NUMA mode? --edit: Well, the issue will be solved this summer independently from Microsoft anyway, with that IO die bullshit.
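
For anyone following along, the /affinity value is just a bitmask of logical CPUs (least significant bit = CPU 0). A minimal Python sketch of what 0xFFFFFFFE means on a 16C/32T part like the 1950X; the helper name is made up for illustration:

# Decode a Windows-style affinity mask into the logical CPU indices it allows.
def decode_affinity(mask: int) -> list[int]:
    return [cpu for cpu in range(mask.bit_length()) if mask & (1 << cpu)]

# 0xFFFFFFFE has bit 0 cleared, so logical CPU 0 is excluded ("Core 0 affinity off").
print(decode_affinity(0xFFFFFFFE))  # [1, 2, ..., 31] -> 31 of the 32 logical CPUs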

SwissArmyDruid posted:

Bitch, if the use of ECC were important to you, you would be using it REGARDLESS of whatever file system you eventually arrive at.
Eh, I'd love to have it on the notion of my computer running 24/7, also hosting 3 permanent VMs, and wanting to avoid random bullshit due to bitflips (very rare unexplainable crashes, or data just getting dented ever so slightly from resting in memory for too long). But 32GB of DDR4-2666 ECC is prohibitively expensive, more so if you want B-dies to apply fast timings to. Never mind 64GB.

Combat Pretzel fucked around with this message at 23:15 on Jan 6, 2019

SwissArmyDruid
Feb 14, 2014

by sebmojo
You've just made my point. If it was more important to you than other aspects of your system, you'd have shelled out for it.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Nah, I refuse to pay that much of a premium over what should just be slightly more than a 12% markup on the price.

--edit: Then again, I might have a wrong idea of memory pricing, because 32GB of non-ECC isn't that much cheaper currently. The difference between Corsair DDR4-2666 and Crucial DDR4-2666 ECC is around 15%. Goddamn. Latter has slower timings, tho.
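
For what it's worth, the ~12% intuition presumably comes from the extra DRAM itself: a standard ECC UDIMM stores 72 bits per 64-bit word, i.e. one extra chip for every eight. A quick back-of-the-envelope in Python:

# Extra silicon needed for ECC, assuming a plain 72-bit (64 data + 8 check bits) UDIMM.
data_bits, ecc_bits = 64, 8
print(f"extra DRAM per rank: {ecc_bits / data_bits:.1%}")  # 12.5%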

Combat Pretzel fucked around with this message at 02:42 on Jan 7, 2019

NewFatMike
Jun 11, 2015

https://www.anandtech.com/show/13788/intels-unlocked-28-core-xeon-w-3175x-oem-tells-us-around-8k-usd

TL;DR Intel's 28C part could be priced anywhere from $4,000 to $8,000. In American dollars. I wonder if that includes the chiller.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

NewFatMike posted:

https://www.anandtech.com/show/13788/intels-unlocked-28-core-xeon-w-3175x-oem-tells-us-around-8k-usd

TL;DR Intel's 28C part could be priced anywhere from $4,000 to $8,000. In American dollars. I wonder if that includes the chiller.

We both know it won't.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Nor one of these, which you will also need on top of the aquarium chiller to keep from going fusion.

http://c1940652.r52.cf0.rackcdn.com/54da7cc2ff2a7c55b6002386/TCU-promo.jpg

I used to work for the company that makes those, as an Intel contractor. Needless to say, that design looks nothing like what I used to work on, but it did involve sub-ambient cooling, heater blocks, and a peltier element.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
If AMD catches up on IPC and clock with the Zen 2 Threadrippers, and maintains the same price per core, there'd definitely be no point to go with that Intel contraption over a 24 or 32 core TR anymore. Especially if that price projection on that Intel is for real.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
When is AMD's CES Keynote again? :f5:

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.

spasticColon posted:

When is AMD's CES Keynote again? :f5:

Wednesday, 9 AM PST/noon EST.

monsterzero
May 12, 2002
-=TOPGUN=-
Boys who love airplanes :respek: Boys who love boys
Lipstick Apathy
Ugh, two days? I can't wait that long. I need to know if I'm going to be pacing the halls until April waiting for a 3600(X).

Otakufag
Aug 23, 2004
I can't wait to be disappointed again by AMD.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Otakufag posted:

I can't wait to be disappointed again by AMD.

This is a refresh and die shrink and Intel has set the bar so low for gains between generations I'm not sure it's possible for AMD to disappoint. All they have to do is have 10% gains over Zen+ and that should be easy. So the only way for this to be true is for them to give a release date of late 2019 or 2020.

Unless you have completely unrealistic expectations and think they are hitting 5GHz or going to take the crown from Intel. Both are possible, but if that is your expectation going in, everything must disappoint you.

They'd have to gently caress up on a Bulldozer level, and Bulldozer was a design choice that didn't end up working and they needed to put it on the market because it was all they had. Zen is a proven design at this point.

monsterzero
May 12, 2002
-=TOPGUN=-
Boys who love airplanes :respek: Boys who love boys
Lipstick Apathy
Yeah, after two years of waiting for Intel to release something that offers a meaningful improvement (gaming) over my OC'd 3570K and doesn't cost $400+, I'd like to think my expectations are tempered.

Really I just want a repeat of my 1GHz T-Bird vs PIII value proposition. Is 90% of the performance at half the cost really too much to ask?

Ulio
Feb 17, 2011


Will AMD and Lisa Su unleash hell on Intel or on their own fans?

Anyway this will be one of the most anticipated keynotes in a while for consumers and investors.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

Otakufag posted:

I can't wait to be disappointed again by AMD.

:same: but in the event I am not, Lisa take my money :retrogames:

ufarn
May 30, 2009
Bets on PCIe 4 motherboards?

avoid doorways
Jun 6, 2010

'twas brillig
Gun Saliva
They wouldn't be doing the keynote if they didn't have something

Right?

Inept
Jul 8, 2003

Licarn posted:

They wouldn't be doing the keynote if they didn't have something

Right?

RX595

Truga
May 4, 2014
Lipstick Apathy
it's gonna be the zen 2 lineup and a token gpu refresh

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
All I'm asking for is a lineup and release date for Zen 2.

FlapYoJacks
Feb 12, 2009
If they offer a 3990WX with 64C/128T, even for 2k, I will put in a pre-order.

Cygni
Nov 12, 2005

raring to post

ufarn posted:

Bets on PCIe 4 motherboards?

The ones for Rome have already leaked. I don't think we will see it on consumer boards yet 'cause the added difficulty in signal integrity adds cost, but it could happen I guess!

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Right outta the Intel playbook.

Polaris++++++

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Nothing on the consumer side needs that kind of lane bandwidth. You're hard-pressed to find enterprise NVMe SSDs that can saturate 4x PCIe 3 lanes, let alone 8 or 16, never mind the throughput doubling you get with 4.0 (and then some, because they reduced protocol overhead or something).

It will be good for extremely high-end networking gear handling multiple 25/40/100Gb links, and maybe a few GPU compute configs could benefit, but it's largely a waste today; CPUs can't keep up with it.
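
Rough numbers behind that, for anyone who wants to sanity-check; this only accounts for the line rate and 128b/130b encoding, not packet/protocol overhead:

# Approximate usable PCIe bandwidth per link width.
def usable_gb_per_s(gt_per_s: float, lanes: int) -> float:
    # line rate * encoding efficiency gives effective Gb/s across the lanes; /8 converts to GB/s
    return gt_per_s * (128 / 130) * lanes / 8

for gen, rate in (("3.0", 8.0), ("4.0", 16.0)):
    for lanes in (1, 4, 8, 16):
        print(f"PCIe {gen} x{lanes}: ~{usable_gb_per_s(rate, lanes):.2f} GB/s")

# PCIe 3.0 x4 works out to ~3.94 GB/s, so a 3.5 GB/s sequential NVMe drive is already
# near the ceiling; a 100 GbE link (~12.5 GB/s) needs roughly 3.0 x16 or 4.0 x8.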

SwissArmyDruid
Feb 14, 2014

by sebmojo
As has been said before, at this point it is no longer a matter of "needs that kind of lane bandwidth"; it has now become "make what used to use 16x lanes use only 8x, or 8x use only 4x."

Cuz you just loving know that Intel's going to choke the loving PCH at 2x PCIe 4.0, because they can, while reducing all the other available lanes for product segmentation purposes, instead of making them available to the user.

Gonna hang that TB4 controller off the PCH too, because that's a smart decision that hasn't come back to bite Intel in the rear end in terms of pushing Thunderbolt adoption at all, no sirree bob.

SwissArmyDruid fucked around with this message at 01:24 on Jan 8, 2019

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

I mean yes, but it actually seems that there is the possibility of a Vega refresh. Like, I don't know why because there is no way in hell even 7nm Vega will be exciting, and a 12nm Vega would be lolworthy.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
How much faster would a 595 be over a 295, like 20-25% after 5 years? :razz:

crazypenguin
Mar 9, 2005
nothing witty here, move along

BangersInMyKnickers posted:

You're hard-pressed to find enterprise NVMe SSDs that can saturate 4x PCIe 3 lanes

Huh? The 3.5 GB/s sequential numbers from even ordinary consumer NVMe drives are doing exactly that.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Due to file system fragmentation, this'll drop a bit. Then having to decode the data on the fly during loads in apps or games turns this to poo poo. And games with streaming geometry and textures have random access patterns, anyway.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Streaming high-res/high-speed video to the drive can pretty much saturate it, I think. Large sequential writes make the FS overhead pretty much amortize away to nothing.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

crazypenguin posted:

Huh? The 3.5 GB/s sequential numbers from even ordinary consumer NVMe drives are doing exactly that.

E: Well poo poo, I had it in my head that 3.0 lanes were 16 Gbps

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
Is the APU on the Ryzen 5 2400G good enough to drive two 4K monitors at 60 Hz and for everything to feel snappy (just like it would with a dedicated graphics card)? I'm running the same setup with an i7-7700T and it struggles pretty quickly - YouTube 4K videos, for example, start dropping frames as soon as other videos are open as well. Or am I better off getting a cheap graphics card as well?

NewFatMike
Jun 11, 2015

Lambert posted:

Is the APU on the Ryzen 5 2400G good enough to drive two 4K monitors at 60 Hz and for everything to feel snappy (just like it would with a dedicated graphics card)? I'm running the same setup with an i7-7700T and it struggles pretty quickly - YouTube 4K videos, for example, start dropping frames as soon as other videos are open as well. Or am I better off getting a cheap graphics card as well?

Should be fine from like a technical perspective, and it looks like the APU supports full VP9 hardware decode. Hopefully someone who has one will have first-hand experience.

Not Wolverine
Jul 1, 2007

Klyith posted:

Disagree: it won't happen until compilers or perhaps the major engines start having really good, mostly-automated methods to create multiprocessing optimizations. Which is kinda like saying it won't happen without a magic wand.

Multi-threaded programming is real hard, games are not naturally suited for it, and studios have limited resources especially in the true programmer department. History has shown that just handing some new hardware to game developers does not result in them having the ability or resources to meaningfully take advantage of it.
Modern video games don't take advantage of multiple cores very well, but I think that is primarily because developers are simply targeting the lowest specs.

What would make multi-threaded game programming hard? Consider that a GTX 1080 has 2560 CUDA cores and 160 texture units; video game graphics appear to be taking advantage of multiple cores, granted that is handled by the graphics API. Why can't developers, say, put one core on loading and unloading data, a core or two on enemy AI, a core on physics, a core on lighting, etc.? It seems like finding ways to split up the workload should be easy; actually programming that is likely difficult. The nice thing about the game industry is that there is software out there specifically to make programming easier, like DirectX, OpenGL, Renderware (which made the PS2 awesome), or even just other game engines with the ability to take advantage of multiple cores built in.

I agree video game programmers can be pretty lazy sometimes, and I think that games will not take advantage of multiple cores any time soon, but I believe that is only because so many PCs out there are still just dual core machines and I believe PC gaming, or even just PC ownership, is becoming a dying trend.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
The latest Steam Hardware Survey reports that 57.20% of Steam users on Windows have 4 cores, 28.50% have 2 cores, 9.40% have 6, 1.89% have 3, 1.73% have 8, and 1.11% have a single core. No data reported on thread counts from hyperthreading in there, but that's a supermajority with 4 or more cores.
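
Quick arithmetic on those figures (numbers copied straight from the survey quote above):

# Share of surveyed Windows machines reporting at least 4 physical cores.
shares = {1: 1.11, 2: 28.50, 3: 1.89, 4: 57.20, 6: 9.40, 8: 1.73}
four_plus = sum(pct for cores, pct in shares.items() if cores >= 4)
print(f"{four_plus:.2f}% report 4 or more cores")  # 68.33%, a bit over two-thirds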

Cygni
Nov 12, 2005

raring to post

Crotch Fruit posted:

I agree video game programmers can be pretty lazy sometimes, and I think that games will not take advantage of multiple cores any time soon, but I believe that is only because so many PCs out there are still just dual core machines and I believe PC gaming, or even just PC ownership, is becoming a dying trend.

Is this a copy paste from 6 years ago?

Dr. Fishopolis
Aug 31, 2004

ROBOT

Crotch Fruit posted:

I agree video game programmers can be pretty lazy sometimes, and I think that games will not take advantage of multiple cores any time soon, but I believe that is only because so many PCs out there are still just dual core machines and I believe PC gaming, or even just PC ownership, is becoming a dying trend.

on what information are you basing this opinion?

Kazinsal
Dec 13, 2011



Crotch Fruit posted:

I agree video game programmers can be pretty lazy sometimes, and I think that games will not take advantage of multiple cores any time soon, but I believe that is only because so many PCs out there are still just dual core machines and I believe PC gaming, or even just PC ownership, is becoming a dying trend.

Source your quotes

Klyith
Aug 3, 2007

GBS Pledge Week

Crotch Fruit posted:

What would make multi-threaded game programming hard? Consider that a GTX 1080 has 2560 CUDA cores and 160 texture units; video game graphics appear to be taking advantage of multiple cores, granted that is handled by the graphics API.

From a 100-miles-up view: graphics are easily parallelized because the pixel in the lower right corner doesn't care what color the pixel in the upper left is. Game state is not easily parallelized because when you want an NPC to open a door, that's AI code and physics code and world-geometry code that all care what state the others are in.

Sharing changeable, interdependent data across threads is a Hard Problem. The race condition is one of the most difficult bugs to find and fix. (If 1 happens before 2, you get 3. If 2 happens before 1, you get apple. Finding out why the gently caress there are apples is really hard if 2-then-1 only happens once in a million times.) The more you care about real-time performance, the harder this all is.
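
To make that concrete, here's a minimal Python sketch of the "lost update" flavor of that bug; the sleep just stands in for real work and forces the bad interleaving to happen nearly every run, and the game-flavored name is made up for illustration:

import threading
import time

hits = 0  # shared game state, no lock protecting it

def apply_damage():
    global hits
    current = hits       # read shared state
    time.sleep(0.001)    # "do work"; meanwhile the other thread reads the same stale value
    hits = current + 1   # write back, silently discarding the other thread's update

threads = [threading.Thread(target=apply_damage) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(hits)  # expected 2, but this almost always prints 1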

GRINDCORE MEGGIDO
Feb 28, 1985


How different is context switching in console OSes vs. Windows?
