repiv
Aug 13, 2009

It's still not clear what's going on with the new Xbox - Microsoft very specifically said it has hardware-accelerated raytracing, but AMD says RDNA doesn't have HW RT, so either MS is full of poo poo or their semi-custom Navi has RT bolted on.

It wouldn't be the first time AMD has backported a feature like that; they spun a version of Polaris with 2xFP16 for the PS4 Pro when AMD's regular designs wouldn't get it until Vega.

repiv fucked around with this message at 18:31 on Jun 16, 2019

SO DEMANDING
Dec 27, 2003

Khorne posted:

Shopping for a motherboard is so tedious.

Yeah, now try shopping for an mITX board :shepicide:

The various B450/X470 ITX boards are all collections of compromises; the rear I/O is what annoys me the most. Listen, I've seen you assholes stick 10 or more USB ports on your ATX boards, so what the hell is this 6-port garbage all about?

Some of the new X570 boards are better on that front, but with the chipset fan (barf) and likely increased cost I'm probably gonna end up with an Asus ROG Strix B450-I. Waiting for B550 boards to come out is just gonna take too drat long.

Drakhoran
Oct 21, 2012

repiv posted:

It's still not clear what's going on with the new Xbox - Microsoft very specifically said it has hardware-accelerated raytracing, but AMD says RDNA doesn't have HW RT, so either MS is full of poo poo or their semi-custom Navi has RT bolted on.

Or the raytracing stuff wasn't ready for this year's release but will be for next year when, coincidentally, the new consoles will launch.

iospace
Jan 19, 2038


repiv posted:

It's still not clear what's going on with the new Xbox - Microsoft very specifically said it has hardware-accelerated raytracing, but AMD says RDNA doesn't have HW RT, so either MS is full of poo poo or their semi-custom Navi has RT bolted on.

It wouldn't be the first time AMD has backported a feature like that; they spun a version of Polaris with 2xFP16 for the PS4 Pro when AMD's regular designs wouldn't get it until Vega.

I think they may have it bolted on. It's sort of a win/win. Microsoft gets to claim they have ray tracing to fire at Sony, and if it's successful, AMD gets to backport it.

I also think the 20 series cards are a bit too early for their own good, but on the other hand I think it's good that at least someone is trying something out and pushing things.

Do I regret buying mine though? Nah. It has some serious power even without the RT part. If the 1670 ever comes out, I might though!

I've always been an Nvidia fangirl, and unless they screw up like Intel has, I doubt I'll switch.

e:

Drakhoran posted:

Or the raytracing stuff wasn't ready for this year's release but will be for next year when, coincidentally, the new consoles will launch.

Probably this as well.

iospace fucked around with this message at 18:35 on Jun 16, 2019

repiv
Aug 13, 2009

Drakhoran posted:

Or the raytracing stuff wasn't ready for this year's release but will be for next year when, coincidentally, the new consoles will launch.

Yeah, but the next-gen consoles' silicon is almost certainly locked down at this point, so it's too late to use the 2020 architecture. If the Xbox Two does have HWRT then I bet it's like the PS4 Pro's weird half-step architecture, where it's fundamentally Navi 2019 but with that one feature cherry-picked from the 2020 arch.

repiv fucked around with this message at 19:00 on Jun 16, 2019

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
Almost certainly Navi 20. The Navi launching now really looks like a small part in both scope and silicon size.

Rusty
Sep 28, 2001
Dinosaur Gum
Sony announced the PS5 would have ray tracing before the MS press conference, so it seems like hardware support is coming for both.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
For all we know, it's just a token ability for buzzword's sake that isn't all that powerful.

repiv
Aug 13, 2009

Rusty posted:

Sony announced the PS5 would have ray tracing before the MS press conference, so it seems like hardware support is coming for both.

Sony never said theirs is hardware-accelerated though, so they might be "supporting" it in the same way they're "supporting" 8K and 120Hz output: technically possible but not at all practical.

nerdrum
Aug 17, 2007

where am I
Stupid question: I'm using an 8600K at 5GHz right now, and I do very VERY VERY large 70+MP raw conversions of medium format stuff. Is jumping to a 3900X going to be a marked improvement over what I'm using now in Capture One?

Rusty
Sep 28, 2001
Dinosaur Gum
Yeah, who knows what it is, but it seems like something AMD told both companies they could say they would be supporting.

surf rock
Aug 12, 2007

We need more women in STEM, and by that, I mean skateboarding, television, esports, and magic.
I agree with Khorne about being annoyed with motherboard shopping.

Unless the post-release benchmarks turn out really badly, I'll be going with the 3900X for my CPU. I'm also looking at the X570 models, but it's really tough to compare them.

My motherboard priorities are:

- Compatibility
- Overclocking support
- WiFi 6 and Bluetooth 5
- Support for very high RAM speeds
- Some futureproofing

I also like the dual-BIOS feature and having a better-than-1Gb Ethernet port.

I've been looking at the Gigabyte Aorus Master since it seems to have all of that and more, but I feel like there's probably a $250 board that covers this ground as well that I haven't tracked down yet.

Puddin
Apr 9, 2004
Leave it to Brak

Rusty posted:

Sony announced the PS5 would have ray tracing before the MS press conference, so it seems like hardware support is coming for both.

Sony has said everything that MS has said, much earlier.

This time around there will be absolutely no differences between the machines (like the Xbone having that 32MB of ESRAM that the PS4 didn't).

Rusty
Sep 28, 2001
Dinosaur Gum

Puddin posted:

Sony has said everything that MS has said, much earlier.

This time around there will be absolutely no differences between the machines (like the Xbone having that 32MB of ESRAM that the PS4 didn't).
From what people have heard from devs, there are some differences in the dev kits in favor of Sony, but there isn't much detail.

Puddin
Apr 9, 2004
Leave it to Brak
That's interesting. Are there any links for any of this stuff?

Rusty
Sep 28, 2001
Dinosaur Gum
It's just rumors at this point:

https://twitter.com/Andrew_Reiner/status/1137833936682274816

It wouldn't surprise me; Microsoft always seems to take shortcuts on launch consoles for some reason.

Rusty fucked around with this message at 23:07 on Jun 16, 2019

iospace
Jan 19, 2038


Rusty posted:

It's just rumors at this point:

https://twitter.com/Andrew_Reiner/status/1137833936682274816

It wouldn't surprise me; Microsoft always seems to take shortcuts on launch consoles for some reason.

I mean, they can sell you the proper console at a premium later on!

NewFatMike
Jun 11, 2015

nerdrum posted:

Stupid question: I'm using an 8600K at 5GHz right now, and I do very VERY VERY large 70+MP raw conversions of medium format stuff. Is jumping to a 3900X going to be a marked improvement over what I'm using now in Capture One?

We'll find out when it's released and benchmarked next month.

Bulgakov
Mar 8, 2009


manuscripts don't burn

xbone being priced up because of the kinect was dumb as hell lol

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

nerdrum posted:

Stupid question: I'm using an 8600K at 5GHz right now, and I do very VERY VERY large 70+MP raw conversions of medium format stuff. Is jumping to a 3900X going to be a marked improvement over what I'm using now in Capture One?
Capture One uses GPU acceleration, so that specific part of your workflow probably won't change at all.

Stickman
Feb 1, 2004

nerdrum posted:

Stupid question: I'm using an 8600K at 5GHz right now, and I do very VERY VERY large 70+MP raw conversions of medium format stuff. Is jumping to a 3900X going to be a marked improvement over what I'm using now in Capture One?

Harik posted:

Capture One uses GPU acceleration, so that specific part of your workflow probably won't change at all.

Is it just GPU, or is it mixed GPU/CPU? I'd check CPU usage during conversion: if it's heavily loading all six of your cores, then you'd probably see a performance improvement. If it's only using a few cores (or just the GPU), then you won't see a difference.
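A minimal way to check this, assuming Python with the third-party psutil package installed (the 75% threshold and sample count are arbitrary choices, not anything Capture One documents): run it in a second window while a conversion is going and see how many cores are actually pegged.

```python
# Hedged sketch: watch per-core CPU load while a conversion runs.
# Requires `pip install psutil`.
import psutil

SAMPLES = 10  # roughly ten seconds of monitoring

for _ in range(SAMPLES):
    # cpu_percent(interval=1, percpu=True) blocks for one second, then
    # returns the utilization of each logical core over that second.
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    busy = sum(1 for pct in per_core if pct > 75)
    print(f"{busy:2d} cores >75% | " + " ".join(f"{p:5.1f}" for p in per_core))
```

If most samples show all six cores busy, more cores should help; if only one or two are loaded, they won't.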

GRINDCORE MEGGIDO
Feb 28, 1985


A PCIe 4 drive could load the projects significantly quicker, probably.

shrike82
Jun 11, 2005
The console hardware stuff is interesting but I'm more interested in the architecture and economics of the cloud gaming stuff that Google and Microsoft are pushing these days.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

GRINDCORE MEGGIDO posted:

A PCIe 4 drive could load the projects significantly quicker, probably.
Doubtful. 70MP pictures are like 140MB-ish; a regular current NVMe drive will be able to load that amount of data into memory in a pretty short time. Add potential filesystem fragmentation and the top speed advantage of PCIe 4 goes to poo poo anyway. Most of the loading time is the CPU unpacking, debayering, and tone-mapping the data.
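Rough numbers behind that claim (the drive speeds below are assumed typical sequential-read figures, not measurements):

```python
# Back-of-the-envelope sketch of raw-file load times; all speeds are
# illustrative assumptions about typical drives, not benchmarks.
PIXELS = 70e6
BYTES_PER_PIXEL = 2                       # ~14-16-bit raw data, packed
file_bytes = PIXELS * BYTES_PER_PIXEL     # ~140 MB, matching the estimate above

drives = {"PCIe 3.0 NVMe": 3.5e9,   # bytes/s sequential read, assumed
          "PCIe 4.0 NVMe": 5.0e9}   # early PCIe 4.0 drive, assumed

for name, speed in drives.items():
    print(f"{name}: {file_bytes / speed * 1e3:.0f} ms per image")
# ~40 ms vs ~28 ms: the faster bus saves about 12 ms per image, which is
# noise next to the seconds the CPU spends debayering and tone-mapping.
```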

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

shrike82 posted:

The console hardware stuff is interesting but I'm more interested in the architecture and economics of the cloud gaming stuff that Google and Microsoft are pushing these days.

The economics are that they're going to sink enormous amounts of money into it and recoup none of it because cloud gaming is still a joke.

GutBomb
Jun 15, 2005

Dude?

iospace posted:

Core dump of thoughts here:

1. The problem with the GPU market is there are no 1440p60* or higher cards available at 400 USD. When I bought my 670, it was pretty much a 1080p60* card. It was also $400 MSRP. If either AMD or Nvidia releases a card that fits that bill, it'll sell like mad.

The RTX 2060 is this card. I have one paired with a 2700X and it comfortably plays everything at 60fps or higher at 1440p with high-to-ultra settings.

(Crysis plays at 50FPS nearly maxed out because it’s dumb)

NewFatMike
Jun 11, 2015

Buildzoid has some really interesting analysis against buying X570 for Ryzen 3000 CPUs:

https://www.youtube.com/watch?v=4JElrCCPDPA

TL;DR: X570 doesn't make sense for cost-effective gaming builds, makes some sense for hardcore overclocking, and there's not a ton of PCIe 4.0 hardware out yet to take advantage of it. If you need high memory bandwidth or 12C+ CPUs, it makes more sense.

I'm really excited to see what the updated Threadripper platforms get us.

e: lol had a different thing in the clipboard

KKKLIP ART
Sep 3, 2004

K8.0 posted:

The economics are that they're going to sink enormous amounts of money into it and recoup none of it because cloud gaming is still a joke.

It's one of those things I would like to see as an option one day. I was just talking about how I'd love to have a small PC or laptop that I could use from the couch to play what would normally be intensive games it couldn't run natively. Bonus if there was a Netflix-type service (which I don't really ever see happening).

wargames
Mar 16, 2008

official yospos cat censor
Zen 2 is good and X570 looks good, but I am sitting this generation out because it just doesn't offer enough of an advantage over what I have to justify the extra cost of gen-1 new stuff. Plus I am looking for the magical 5.0GHz break point for 12C.

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

SO DEMANDING posted:

Yeah, now try shopping for an mITX board :shepicide:

The various B450/X470 ITX boards are all collections of compromises; the rear I/O is what annoys me the most. Listen, I've seen you assholes stick 10 or more USB ports on your ATX boards, so what the hell is this 6-port garbage all about?

Some of the new X570 boards are better on that front, but with the chipset fan (barf) and likely increased cost I'm probably gonna end up with an Asus ROG Strix B450-I. Waiting for B550 boards to come out is just gonna take too drat long.

If having "only" six USB ports on the rear is the least-worst compromise you can find, seems to me it's time to just bite the bullet and grab a USB hub.

Stickman
Feb 1, 2004

SO DEMANDING posted:

Yeah, now try shopping for an mITX board :shepicide:

The various B450/X470 ITX boards are all collections of compromises; the rear I/O is what annoys me the most. Listen, I've seen you assholes stick 10 or more USB ports on your ATX boards, so what the hell is this 6-port garbage all about?

Some of the new X570 boards are better on that front, but with the chipset fan (barf) and likely increased cost I'm probably gonna end up with an Asus ROG Strix B450-I. Waiting for B550 boards to come out is just gonna take too drat long.

Farmer Crack-Ass posted:

If having "only" six USB ports on the rear is the least-worst compromise you can find, seems to me it's time to just bite the bullet and grab a USB hub.

OP post/username? I wouldn't hold my breath on eight-USB ITX B550s, though.

E: If you really need more USB ports running faster than a hub, get a board with two PCIe M.2 slots and use an M.2-to-PCIe x4 riser to add a USB card. I think the ASRock X470 ITX board might also support bifurcation of the GPU x16 slot, which you could use to add USB ports via a riser.

Stickman fucked around with this message at 01:57 on Jun 17, 2019

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

NewFatMike posted:

I'm really excited to see what the updated Threadripper platforms get us.

Me too. I've got a $650 CPU in this system as a "placeholder" for the Zen 2 update, and I was rather peeved when Threadripper got unceremoniously removed from the roadmap in April. Now that it's confirmed again, with a 64-core flagship rumored, I'm deciding if I actually really need more than the 12C 2920X I'm running.

I'm not really hurting for compute right now, so I may just get the bottom-rung Zen 2 chip (if it's 16C like people expect) for the better IPC, memory controller, and boost clocks.

Oh, and PCIe 4 NVMe. Which means I'll be selling the whole system and buying a new one. :(

SO DEMANDING
Dec 27, 2003

Farmer Crack-Ass posted:

If having "only" six USB ports on the rear is the least-worst compromise you can find, seems to me it's time to just bite the bullet and grab a USB hub.

Yeah, it's really only an annoyance, not a critical flaw. My current PC has lasted a long-rear end time and I expect my next one will as well, so I just want to try to get everything perfect. I know, never gonna happen, just let me piss and moan on the internet, ok?

Stickman posted:

E: If you really need more USB ports running faster than a hub, get a board with two PCIe M.2 slots and use an M.2-to-PCIe x4 riser to add a USB card. I think the ASRock X470 ITX board might also support bifurcation of the GPU x16 slot, which you could use to add USB ports via a riser.

What in the good goddamn gently caress.

NewFatMike
Jun 11, 2015

I've decided that just bouncing my desktops down to NAS duty when it's time to upgrade is the way to go for me. If you've got core-heavy workloads, maybe you'll still get mileage out of one as an offloading server?

repiv
Aug 13, 2009

Stickman posted:


E: If you really need more USB ports running faster than a hub, get a board with two PCIe M.2 slots and use an M.2-to-PCIe x4 riser to add a USB card. I think the ASRock X470 ITX board might also support bifurcation of the GPU x16 slot, which you could use to add USB ports via a riser.

There are self-contained M.2 USB controllers out there somewhere, but I'll be damned if I can find anywhere to actually buy them.

https://diarts-tech.com/product/2-port-internal-usb-3-1-gen-2-10gbs-m-2-card/

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

I'm doing prep work for my All-Generations Ryzen Real-World Workload Numbers-O-Rama (Don't Call It A Benchmark).

I think/hope it will be useful for people wanting to know how these processors perform when pushed hard, even if they don't understand what the workloads are (full disclosure: I don't completely understand what the workloads are either, and I can't find any information at all on the algorithms/packages being used by the MCM subproject). If nothing else these numbers are being generated by doing actual work, of various types, for extended periods of time, with the processors at maximum load.

I won't mention it again until it's done, but I'm posting it once in its unfinished state because it might help some people who are agonizing over springing for a "lowly" 1600 to see what one is really capable of. (Spoiler: the 1600 is a crazy value right now, in terms of money and the work it can do.) And I'm sure the 2X00s will be on fire-sale soon. Expect an update every 36-ish hours, as I process and add data for the ZIKA, FAH2, and MCM subprojects. https://firepear.net/grid/ryzen3900/

mdxi fucked around with this message at 04:58 on Jun 17, 2019

eames
May 9, 2009

FWIW der8auer dropped some hints in one of his recent videos, saying that he has been benching the new 12C/16C parts for a while and that the power consumption when overclocked is "on a whole new level compared to last gen". He implied that the huge VRMs are one of the main reasons why you would want an X570 board for OC.
Not huge news, but it seems like the CPUs will OC well; it just sounds like one will have to find a way to deal with 400-500W at the voltage/frequency limit.
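For a sense of why the numbers climb that fast: dynamic power scales roughly with C·V²·f, so the voltage bump hurts quadratically. A sketch with invented baseline figures (the wattage, voltage, and clocks below are illustrative assumptions, not der8auer's numbers):

```python
# Hedged sketch of dynamic power scaling, P2 = P1 * (V2/V1)^2 * (f2/f1).
# Baseline values are made up for illustration only.
def scaled_power(p_base, v_base, f_base, v_new, f_new):
    """Classic CMOS dynamic power relation, P ~ C * V^2 * f."""
    return p_base * (v_new / v_base) ** 2 * (f_new / f_base)

stock = scaled_power(142, 1.10, 4.0, 1.10, 4.0)  # hypothetical 12C at stock
oced = scaled_power(142, 1.10, 4.0, 1.45, 4.6)   # pushed to 1.45 V / 4.6 GHz
print(f"stock ~{stock:.0f} W, overclocked ~{oced:.0f} W")
# ~142 W -> ~284 W from dynamic power alone; leakage also rises steeply
# with voltage, so 400-500 W at the limit isn't far-fetched.
```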

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

taqueso posted:

Wake me up when I can get four 64-core chips on the same motherboard.

The way they're using the PCIe lanes for the interconnect, I doubt you're going to see quad-socket on any AMD server platform ever.

e: now that I think about it, with the move to PCIe 4.0 they might be able to do it with 32 lanes per interconnect in a ring topology instead of the 64 on 3.0, so long as your workload is reasonably NUMA-aware. hmmmmmmmmm
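The lane math checks out, for what it's worth. A quick sanity-check sketch (per-lane rates come from the specs' 8 and 16 GT/s transfer rates with 128b/130b encoding):

```python
# Sanity check: PCIe 4.0 x32 matches PCIe 3.0 x64 in raw bandwidth.
# Per-lane GB/s = transfer rate (GT/s) * 128/130 encoding overhead / 8 bits.
per_lane = {"PCIe 3.0": 8 * 128 / 130 / 8,     # ~0.985 GB/s per lane
            "PCIe 4.0": 16 * 128 / 130 / 8}    # ~1.969 GB/s per lane

for gen, lanes in [("PCIe 3.0", 64), ("PCIe 4.0", 32)]:
    print(f"{gen} x{lanes}: {per_lane[gen] * lanes:.1f} GB/s per direction")
# Both land around ~63 GB/s, so halving the lanes per socket-to-socket
# link on 4.0 keeps the same bandwidth while freeing lanes for a ring.
```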

BangersInMyKnickers fucked around with this message at 14:31 on Jun 17, 2019

Klyith
Aug 3, 2007

GBS Pledge Week

eames posted:

FWIW der8auer dropped some hints in one of his recent videos, saying that he has been benching the new 12C/16C parts for a while and that the power consumption when overclocked is "on a whole new level compared to last gen". He implied that the huge VRMs are one of the main reasons why you would want an X570 board for OC.
Not huge news, but it seems like the CPUs will OC well; it just sounds like one will have to find a way to deal with 400-500W at the voltage/frequency limit.

Remember that der8auer's definition of "overclocking" is very different from most people's.


NewFatMike posted:

TL;DR: X570 doesn't make sense for cost-effective gaming builds, makes some sense for hardcore overclocking, and there's not a ton of PCIe 4.0 hardware out yet to take advantage of it. If you need high memory bandwidth or 12C+ CPUs, it makes more sense.

So when you look at everything the chipset offers, the boards aren't overpriced. But it's also the wrong time to buy a fancy, expensive, future-proof mobo, because the socket is now at the end of its run.

With both AMD and Zen proving themselves, the time to splurge will be X670 (or whatever) with a new AM5, Zen 3, and DDR5.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


But DDR5 might be expensive, or only the slow chips affordable. I remember the same thing around the DDR4 launch, with people picking DDR3 because it was faster and cheaper at that moment.
