|
It's still not clear what's going on with the new Xbox - Microsoft very specifically said it has hardware-accelerated raytracing, but AMD says RDNA doesn't have HW RT, so either MS is full of poo poo or their semi-custom Navi has RT bolted on. It wouldn't be the first time AMD has backported a feature like that; they spun a version of Polaris with 2xFP16 for the PS4 Pro when AMD's regular designs wouldn't get it until Vega. repiv fucked around with this message at 18:31 on Jun 16, 2019 |
# ? Jun 16, 2019 18:24 |
|
|
Khorne posted:Shopping for a motherboard is so tedious. Yeah, now try shopping for an mITX board. The various B450/X470 ITX boards are all a collection of various compromises; the rear IO is what annoys me the most. Listen, I've seen you assholes stick 10 or more USB ports on your ATX boards, what the hell is this 6-port garbage all about? Some of the new X570 boards are better on that front, but with the chipset fan (barf) and likely increased cost I'm probably gonna end up with an Asus ROG Strix B450-I. Waiting for B550 boards to come out is just gonna be too drat long.
|
# ? Jun 16, 2019 18:26 |
|
repiv posted:It's still not clear what's going on with the new Xbox - Microsoft very specifically said it has hardware-accelerated raytracing, but AMD says RDNA doesn't have HW RT, so either MS is full of poo poo or their semi-custom Navi has RT bolted on. Or the raytracing stuff wasn't ready for this year's release but will be for next year when, coincidentally, the new consoles will launch.
|
# ? Jun 16, 2019 18:31 |
|
repiv posted:It's still not clear what's going on with the new Xbox - Microsoft very specifically said it has hardware-accelerated raytracing, but AMD says RDNA doesn't have HW RT, so either MS is full of poo poo or their semi-custom Navi has RT bolted on. I think they may have it bolted on. It's sort of a win/win: Microsoft gets to claim they have ray tracing to fire at Sony, and if it's successful, AMD gets to backport it. I also think the 20 series of cards is a bit too early for its own good, but on the other hand I think it's good that someone at least is trying something out and pushing things. Do I regret buying mine though? Nah. It has some serious power without the RT part. If the 1670 ever comes out, I might though! I've always been an Nvidia fangirl, and unless they screw up like Intel has, I doubt I'll switch. e: Drakhoran posted:Or the raytracing stuff wasn't ready for this year's release but will be for next year when, coincidentally, the new consoles will launch. Also probably this as well. iospace fucked around with this message at 18:35 on Jun 16, 2019 |
# ? Jun 16, 2019 18:32 |
|
Drakhoran posted:Or the raytracing stuff wasn't ready for this year's release but will be for next year when, coincidentally, the new consoles will launch. Yeah, but the next-gen consoles' silicon is almost certainly locked down at this point, so it's too late to use the 2020 architecture. If the Xbox Two does have HWRT then I bet it's like the PS4 Pro's weird half-step architecture, where it's fundamentally Navi 2019 but with that one feature cherry-picked from the 2020 arch. repiv fucked around with this message at 19:00 on Jun 16, 2019 |
# ? Jun 16, 2019 18:53 |
|
Almost certainly Navi 20. The Navi that's launching now really looks like a small part, in both scope and silicon size.
|
# ? Jun 16, 2019 19:35 |
|
Sony announced the PS5 would have ray tracing before the MS press conference, so it seems like hardware support is coming for both.
|
# ? Jun 16, 2019 19:54 |
|
For all we know, it's just a token ability for buzzword's sake that isn't all that powerful.
|
# ? Jun 16, 2019 19:57 |
|
Rusty posted:Sony announced the PS5 would have ray tracing before the MS press conference, so it seems like hardware support is coming for both. Sony never said theirs is hardware accelerated though, so they might be "supporting" it in the same way they're "supporting" 8K and 120hz output. Technically possible but not at all practical.
|
# ? Jun 16, 2019 19:58 |
|
Stupid question, I'm using an 8600k at 5ghz right now, I do very VERY VERY large 70+mp raw conversion of medium format stuff, is jumping to a 3900x going to be a marked improvement from what I'm using now in CaptureOne?
|
# ? Jun 16, 2019 19:58 |
|
Yeah, who knows what it is, but seems like something AMD told both companies they could say they would be supporting.
|
# ? Jun 16, 2019 19:58 |
|
I agree with Khorne about being annoyed with motherboard shopping. Unless the post-release benchmarks turn out really badly, I'll be going with the 3900X for my CPU. I'm also looking at the X570 models, but it's really tough to compare them. My motherboard priorities are:
- Compatibility
- Enable overclocking
- WiFi 6 and Bluetooth 5
- Allow very high RAM speed
- Some futureproofing
I also like the dual BIOS feature and having a better-than-1Gb Ethernet port. I've been looking at the Gigabyte Aorus Master since it seems to have all of that and more, but I feel like there's probably a $250 board that covers this ground as well that I haven't tracked down yet.
|
# ? Jun 16, 2019 21:08 |
|
Rusty posted:Sony announced the PS5 would have ray tracing before the MS press conference, so it seems like hardware support is coming for both. PS have said everything that MS has said, much earlier. This time around there will be absolutely no differences between the machines (unlike last gen, with the Xbone having that 32MB of ESRAM that the PS4 didn't).
|
# ? Jun 16, 2019 21:28 |
|
Puddin posted:Ps have said everything that MS has said, much earlier.
|
# ? Jun 16, 2019 22:15 |
|
That's interesting. Are there any links for any of this stuff?
|
# ? Jun 16, 2019 22:48 |
|
It's just rumors at this point: https://twitter.com/Andrew_Reiner/status/1137833936682274816 It wouldn't surprise me, Microsoft always seems to take shortcuts on launch consoles for some reason. Rusty fucked around with this message at 23:07 on Jun 16, 2019 |
# ? Jun 16, 2019 23:04 |
|
Rusty posted:It's just rumors at this point: I mean, they can sell you the proper console at a premium later on!
|
# ? Jun 16, 2019 23:27 |
|
nerdrum posted:Stupid question, I'm using an 8600k at 5ghz right now, I do very VERY VERY large 70+mp raw conversion of medium format stuff, is jumping to a 3900x going to be a marked improvement from what I'm using now in CaptureOne? We'll find out when it's released and benchmarked next month.
|
# ? Jun 16, 2019 23:30 |
|
xbone being priced up because of the kinect was dumb as hell lol
|
# ? Jun 16, 2019 23:33 |
|
nerdrum posted:Stupid question, I'm using an 8600k at 5ghz right now, I do very VERY VERY large 70+mp raw conversion of medium format stuff, is jumping to a 3900x going to be a marked improvement from what I'm using now in CaptureOne?
|
# ? Jun 16, 2019 23:36 |
|
nerdrum posted:Stupid question, I'm using an 8600k at 5ghz right now, I do very VERY VERY large 70+mp raw conversion of medium format stuff, is jumping to a 3900x going to be a marked improvement from what I'm using now in CaptureOne? Harik posted:CaptureOne uses GPU acceleration so that specific part of your workflow probably won't change at all. Is it just GPU, or is it mixed GPU/CPU? I'd check CPU usage during conversion - if it's heavily loading all 6 of your cores, then you'd probably see a performance improvement. If it's only using a few cores (or just the GPU), then you won't see a difference.
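One quick way to run that check on Linux without installing anything: sample /proc/stat twice while the conversion is going and compare per-core busy ticks. This is a generic sketch, not tied to CaptureOne or any particular CPU; the /proc/stat field layout is Linux-only, and on Windows you'd just watch the per-core graphs in Task Manager instead.

```python
# Sample /proc/stat twice to see whether a running workload is actually
# loading every core. Linux-only, stdlib-only.
import time

def parse_proc_stat(text):
    """Extract (busy_ticks, total_ticks) for each per-core cpuN line."""
    sample = {}
    for line in text.splitlines():
        fields = line.split()
        # skip the aggregate "cpu" line, keep cpu0, cpu1, ...
        if fields and fields[0].startswith("cpu") and fields[0] != "cpu":
            ticks = [int(x) for x in fields[1:]]
            idle = ticks[3] + ticks[4]  # idle + iowait columns
            sample[fields[0]] = (sum(ticks) - idle, sum(ticks))
    return sample

def core_utilization(before, after):
    """Busy fraction per core between two samples (0.0 = idle, 1.0 = pegged)."""
    util = {}
    for core, (busy0, total0) in before.items():
        busy1, total1 = after[core]
        dt = total1 - total0
        util[core] = (busy1 - busy0) / dt if dt else 0.0
    return util

if __name__ == "__main__":
    with open("/proc/stat") as f:
        a = parse_proc_stat(f.read())
    time.sleep(1)  # run this while the conversion is in progress
    with open("/proc/stat") as f:
        b = parse_proc_stat(f.read())
    for core, u in sorted(core_utilization(a, b).items()):
        print(f"{core}: {u:.0%}")
```

If most cores sit near 100% during a batch conversion, more cores should help; if only a couple do (or the GPU is the bottleneck), a 12-core upgrade won't move the needle much.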
|
# ? Jun 16, 2019 23:55 |
|
A PCIe 4 drive could load the projects significantly quicker, probably.
|
# ? Jun 17, 2019 00:10 |
|
The console hardware stuff is interesting but I'm more interested in the architecture and economics of the cloud gaming stuff that Google and Microsoft are pushing these days.
|
# ? Jun 17, 2019 00:12 |
|
GRINDCORE MEGGIDO posted:Pcie4 drive could load the projects significantly quicker, probably.
|
# ? Jun 17, 2019 00:25 |
|
shrike82 posted:The console hardware stuff is interesting but I'm more interested in the architecture and economics of the cloud gaming stuff that Google and Microsoft are pushing these days. The economics are that they're going to sink enormous amounts of money into it and recoup none of it because cloud gaming is still a joke.
|
# ? Jun 17, 2019 00:44 |
|
iospace posted:Core dump of thoughts here: RTX 2060 is this. I have one paired with a 2700x and it comfortably plays everything at 60fps or higher at 1440p with high to ultra settings. (Crysis plays at 50FPS nearly maxed out because it’s dumb)
|
# ? Jun 17, 2019 00:56 |
|
Buildzoid has some really interesting analysis against buying X570 for Ryzen 3000 CPUs: https://www.youtube.com/watch?v=4JElrCCPDPA TL;DR cost-effective gaming kits don't make sense, hardcore overclocking does make some sense, and there's not a ton of PCIe 4.0 stuff to take advantage of. If you need high memory bandwidth or 12C+ CPUs, it makes more sense. I'm really excited to see what the updated Threadripper platforms get us. e: lol had a different thing in the clipboard
|
# ? Jun 17, 2019 01:08 |
|
K8.0 posted:The economics are that they're going to sink enormous amounts of money into it and recoup none of it because cloud gaming is still a joke. It's one of those things I would like to see as an option one day. I was just talking about how I'd love to have a small PC or laptop that I could just use from the couch to play what would normally be intensive games that it wouldn't run natively. Bonus if there was a Netflix-type service (which I don't really ever see happening).
|
# ? Jun 17, 2019 01:11 |
|
Zen2 is good and X570 looks good, but I am sitting this generation out because it just doesn't offer enough of an advantage over what I have to justify the extra cost of gen-1 new stuff. Plus I am looking for the magical 5.0GHz break point for 12c.
|
# ? Jun 17, 2019 01:38 |
|
SO DEMANDING posted:Yeah, now try shopping for an mITX board If having "only" six USB ports on the rear is the least-worst compromise you can find, seems to me it's time to just bite the bullet and grab a USB hub.
|
# ? Jun 17, 2019 01:41 |
|
SO DEMANDING posted:Yeah, now try shopping for an mITX board Farmer Crack-rear end posted:If having "only" six USB ports on the rear is the least-worst compromise you can find, seems to me it's time to just bite the bullet and grab a USB hub. OP post/username? I wouldn't hold my breath on eight-USB ITX B550s, though. E: If you really need more USB ports running faster than a hub, get a board with 2x PCIe M.2 slots and use an M.2 to PCIe x4 riser to add a USB card. I think the ASRock X470 ITX board might also support bifurcation of the GPU x16 slot, which you could use to add USB ports via a riser. Stickman fucked around with this message at 01:57 on Jun 17, 2019 |
# ? Jun 17, 2019 01:55 |
|
NewFatMike posted:I'm really excited to see what the updated Threadripper platforms get us. Me too. I've got a $650 CPU in this system as a "placeholder" for the Zen2 update, and I was rather peeved when Threadripper got unceremoniously removed from the roadmap in April. Now that it's confirmed again with a 64-core flagship rumored, I'm deciding if I actually really need more than the 12c 2920X I'm running. I'm not really hurting for compute right now, so I may just get the bottom-rung Zen2 chip (if it's 16c like people expect) for the better IPC, memory controller, and boost clocks. Oh, and PCIe 4 NVMe. Which means I'll be selling the whole system and buying a new one.
|
# ? Jun 17, 2019 02:05 |
|
Farmer Crack-rear end posted:If having "only" six USB ports on the rear is the least-worst compromise you can find, seems to me it's time to just bite the bullet and grab a USB hub. Yeah it's really only an annoyance not a critical flaw. My current PC has lasted a long-rear end time and I expect my next one will as well, so I just want try to get everything perfect. I know, never gonna happen, just let me piss and moan on the internet, ok? Stickman posted:E: If you really need more USB ports running faster than a HUB, get a board with a 2x PCIe M.2 slots and use a M.2 to PCIe x4 riser to add a USB card. I think the ASRock X470 ITX board might also support bifurcation of the GPU x16 slot, which you could use to add USB ports via a riser. What in the good goddamn gently caress.
|
# ? Jun 17, 2019 02:11 |
|
I've decided that just bouncing my desktops down to NAS duty when it's time to upgrade is the way to go for me - if you've got core-heavy workloads, maybe you'll still get mileage out of one as an offloading server?
|
# ? Jun 17, 2019 02:11 |
|
Stickman posted:
There are self-contained M.2 USB controllers out there somewhere, but I'll be damned if I can find anywhere to actually buy them. https://diarts-tech.com/product/2-port-internal-usb-3-1-gen-2-10gbs-m-2-card/
|
# ? Jun 17, 2019 03:02 |
|
I'm doing prep work for my All-Generations Ryzen Real-World Workload Numbers-O-Rama (Don't Call It A Benchmark). I think/hope it will be useful for people wanting to know how these processors perform when pushed hard, even if they don't understand what the workloads are (full disclosure: I don't completely understand what the workloads are either, and I can't find any information at all on the algorithms/packages being used by the MCM subproject). If nothing else these numbers are being generated by doing actual work, of various types, for extended periods of time, with the processors at maximum load. I won't mention it again until it's done, but I'm posting it once in its unfinished state because it might help some people who are agonizing over springing for a "lowly" 1600, to see what one is really capable of. (Spoiler: the 1600 is a crazy value right now, in terms of money and work it can do.) And I'm sure the 2X00s will be on fire-sale soon. Expect an update every 36-ish hours, as I process and add data for the ZIKA, FAH2, and MCM subprojects. https://firepear.net/grid/ryzen3900/ mdxi fucked around with this message at 04:58 on Jun 17, 2019 |
# ? Jun 17, 2019 04:55 |
|
FWIW der8auer dropped some hints in one of his recent videos, saying that he has been benching the new 12C/16C parts for a while and the power consumption when overclocked is “on a whole new level compared to last gen”. He implied that the huge VRMs are one of the main reasons why you would want a X570 board for OC. Not huge news but it seems like the CPUs will OC well, it just sounds like one will have to find a way to deal with 400-500W at the voltage/frequency limit.
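For a rough sense of why overclocked power balloons like that: dynamic power scales roughly with V²·f, so a modest frequency gain bought with a big voltage bump gets expensive fast. A back-of-envelope sketch (the wattages, voltages, and clocks below are made-up illustrative numbers, not measured Ryzen 3000 figures):

```python
# First-order dynamic power model: P ~ C * V^2 * f, with switched
# capacitance C held fixed. Illustrative only; real chips also have
# static leakage, which rises with voltage and temperature too.

def scaled_power(p_base, v_base, f_base, v_oc, f_oc):
    """Estimate package power after an overclock:
    P2 = P1 * (V2/V1)^2 * (f2/f1)."""
    return p_base * (v_oc / v_base) ** 2 * (f_oc / f_base)

# Hypothetical 140 W part at 1.10 V / 4.0 GHz pushed to 1.40 V / 4.6 GHz:
print(round(scaled_power(140, 1.10, 4.0, 1.40, 4.6)))  # ~261 W
```

A 15% clock bump paid for with a 27% voltage bump nearly doubles power in this toy model, which is consistent with the "whole new level" framing and the appeal of beefy X570 VRMs.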
|
# ? Jun 17, 2019 06:07 |
|
taqueso posted:Wake me up when I can get four 64-core chips on the same motherboard. The way they're using the PCIe lanes for interconnect, I doubt you're going to see quad-socket on any AMD server platform ever. e: now that I think about it, with the move to PCIe 4.0 they might be able to do it with 32 lanes per interconnect in a ring topology instead of the 64 on 3.0, so long as your workload is reasonably NUMA-aware. hmmmmmmmmm BangersInMyKnickers fucked around with this message at 14:31 on Jun 17, 2019 |
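The lane math behind that thought: PCIe 4.0 doubles the transfer rate per lane (8 GT/s to 16 GT/s, keeping 128b/130b encoding), so 32 gen-4 lanes carry the same aggregate bandwidth as 64 gen-3 lanes. A quick sanity check:

```python
# Per-lane usable bandwidth for PCIe gen 3 vs gen 4, both with
# 128b/130b encoding. Shows why halving the lane count while moving
# to gen 4 keeps socket-interconnect bandwidth constant.

def lane_bandwidth_gbytes(gen):
    """Usable GB/s per lane, per direction."""
    gt_per_s = {3: 8, 4: 16}[gen]      # raw transfer rate
    return gt_per_s * (128 / 130) / 8  # encoding efficiency, bits -> bytes

print(f"64 lanes of gen 3: {64 * lane_bandwidth_gbytes(3):.1f} GB/s")
print(f"32 lanes of gen 4: {32 * lane_bandwidth_gbytes(4):.1f} GB/s")
```

Both work out to about 63 GB/s per direction, so the bandwidth is a wash; whether the latency and topology hold up under a real NUMA workload is the open question.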
# ? Jun 17, 2019 14:07 |
|
eames posted:FWIW der8auer dropped some hints in one of his recent videos, saying that he has been benching the new 12C/16C parts for a while and the power consumption when overclocked is “on a whole new level compared to last gen”. He implied that the huge VRMs are one of the main reasons why you would want a X570 board for OC. Remember that der8auer's definition of "overclocking" is very different than most people's. NewFatMike posted:TL;DR cost effective gaming kits don't make sense, hardcore overclocking does make some sense, and there's not a ton of PCIe 4.0 stuff to take advantage of it. If you need high memory bandwidth or 12C+ CPUs, it makes more sense. So when you look at everything the chipset offers, they're not overpriced. But it's also the wrong time to buy a fancy expensive future-proof mobo because the socket is now at the end of its run. With both AMD and Zen proving themselves, the time to splurge will be x670 (or whatever) with a new AM5 Zen3 and DDR5.
|
# ? Jun 17, 2019 14:29 |
|
|
But DDR5 might be expensive, or only the slow chips affordable. I remember the same thing around the DDR4 launch, with people picking DDR3 because it was faster and cheaper at that moment.
|
# ? Jun 17, 2019 15:09 |