|
I still doubt that part of the rumor, but if there is actual DLSS I'm curious how it's going to be cooled. DLSS isn't computationally trivial; that's going to be a significant amount of extra heat produced. It'll definitely be a docked-only feature if it does exist - but as I said, probably BS.
|
# ? Mar 4, 2021 15:58 |
|
could they stick the extra processing components (RT cores???) in the dock? if they could sell a 150 dollar dock that upgrades the existing switch install base, that would be a killer feature, but I kind of doubt it from Nintendo tbh
|
# ? Mar 4, 2021 16:03 |
|
bus hustler posted:DLSS on the switch itself would rule, render Cyberpunk at 240p and upscale to 720p Thought DLSS required more base pixels than that to work with, so render at 720p and upscale to 1440p?
|
# ? Mar 4, 2021 16:16 |
|
v1ld posted:Thought DLSS required more base pixels than that to work with, so render at 720p and upscale to 1440p? https://www.reddit.com/r/nvidia/comments/g7rfzb/control_comparison_of_very_low_dlss_20/ you can do it or force it in Control at least.
|
# ? Mar 4, 2021 16:28 |
|
Neat! That grain is going to be far less discernible on the Switch's small screen.
|
# ? Mar 4, 2021 16:35 |
|
v1ld posted:Thought DLSS required more base pixels than that to work with, so render at 720p and upscale to 1440p? The recent "Ultra Performance" mode for DLSS upscales a whole 9x. Obviously the result isn't amazing, but DLSS can do a lot of work, not to mention if it's set to dynamically change.
|
# ? Mar 4, 2021 16:47 |
|
ufarn posted:The recent "Ultra Performance" mode for DLSS upscales a whole 9x. Obviously the result isn't amazing, but DLSS can do a lot of work, not to mention if it's set to dynamically change. So DLSS allows for dynamic resolution scaling on the input? E.g., maintain a fixed 1080p output while changing the source resolution to match current render/compute constraints? My original question was about whether a certain minimum input resolution is needed to get good results from the temporal accumulation work it's doing to upscale. Seems like it gets reasonable, if grainy, results even at very low input resolutions.
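For the pixel math: the 9x in Ultra Performance refers to total pixel count, which works out to 3x per axis. A quick sketch (the resolutions here are just illustrative):

```python
def input_resolution(out_w: int, out_h: int, pixel_scale: float) -> tuple[int, int]:
    """Render resolution needed for a given output resolution and
    total-pixel upscale factor (9x pixels == 3x per axis)."""
    axis_scale = pixel_scale ** 0.5
    return round(out_w / axis_scale), round(out_h / axis_scale)

# Ultra Performance (9x) at 4K output renders internally at 720p
print(input_resolution(3840, 2160, 9))  # -> (1280, 720)
```

Which lines up with the grainy-but-usable results people are reporting at very low internal resolutions.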
|
# ? Mar 4, 2021 17:03 |
|
v1ld posted:So DLSS allows for dynamic resolution scaling on the input? E.g., maintain a fixed 1080p output while changing the source resolution to match current render/compute constraints? DLSS 2.1 added the ability to vary the input resolution on the fly, yes, but like VR support, I don't think any games have shipped with that feature yet
|
# ? Mar 4, 2021 17:04 |
|
repiv posted:DLSS 2.1 added the ability to vary the input resolution on the fly yes, but like VR support I don't think any games have shipped that feature yet Doesn't Cyberpunk have it? There's an auto DLSS setting that I thought targeted a specific framerate and varied the resolution to try and hit it.
|
# ? Mar 4, 2021 17:06 |
|
That's what I thought it would be too, but IIRC that DLSS Auto mode just picks a fixed preset based on your resolution (e.g. Quality at 1080p, Performance at 1440p, etc)
|
# ? Mar 4, 2021 17:10 |
|
denereal visease posted:Thanks for this good post, my industry colleague. An idle musing more than anything, but I wonder what the fabs think about IDPR/DPR water as input to their process train. I have to imagine the facilities are doing additional water treatment at the fab after receiving water from the distribution system, or that they're connected to a raw water supply and do all treatment on-site.

Thanks! Regarding IDPR/DPR, I think they will end up treating either just as they would any basic connection to a municipal water system. I don't yet have concrete information on what additional treatment(s) TSMC may be doing, but using Intel's (and others') fabs that are located here as a reference, they're connected to the actual distribution system, then have EQ tanks on-site that allow them to maintain continuous supply even during peak-hour demand, unexpected waterline shutdowns, etc. From there, the water is further treated for actual process needs and any systems that utilize closed loops, but for anything that uses an open loop (such as heat exchangers), the distribution water is more than sufficient.

One of the main issues seen is high TDS, depending on the source: groundwater here is usually the "lowest" (though quite high by other standards in the US), while CAP can be pretty high and SRP can be astronomical depending on the time of year. Most of the plants can bring it down some via various treatment technologies, but it's still not uncommon for CAP- and SRP-sourced distribution systems to see TDS around 700 - 1000 ppm (a lot of times, the cities will blend with groundwater, which brings it down some). So most of the water-intensive industries in Phoenix that are also sensitive to TDS typically have to do some pre-treatment to at least bring down the TDS.

The planning that I have seen for IDPR/DPR involves treating it to a water quality substantially above that of the distribution system (to demonstrate it's safe, get public approval, and more critically regulatory approval), and I think in most cases it will then blend into the system; but if TSMC were able to directly consume any IDPR/DPR water, it would likely reduce their pre-treatment requirements.
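The blending mentioned above is a straightforward two-source mass balance. A sketch (the TDS figures below are round numbers in the ranges from the post, not measured values):

```python
def blend_fraction(source_tds: float, diluent_tds: float, target_tds: float) -> float:
    """Fraction of the high-TDS source in a two-source blend that hits a
    target TDS, from the mass balance:
        target = f * source + (1 - f) * diluent
    """
    return (target_tds - diluent_tds) / (source_tds - diluent_tds)

# e.g. blending 1000 ppm CAP/SRP-sourced water with 400 ppm groundwater
# down to 700 ppm means a 50/50 blend
print(blend_fraction(1000, 400, 700))  # -> 0.5
```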
|
# ? Mar 4, 2021 17:11 |
|
repiv posted:That's what I thought it would be too, but IIRC that DLSS Auto mode just picks a fixed preset based on your resolution (e.g. Quality at 1080p, Performance at 1440p, etc) That's really stupid but I guess I shouldn't have expected better from that game
|
# ? Mar 4, 2021 17:13 |
|
Cyberpunk's DLSS implementation at 1440p is really lackluster imho. Even at Quality I had to crank the sharpening filter way up just to get a semi-clean image. And certain screens like the inventory menu and V's reflections are still blurry as hell.
|
# ? Mar 4, 2021 17:16 |
|
exquisite tea posted:Cyberpunk's DLSS implementation at 1440p is really lackluster imho. Even at Quality I had to crank the sharpening filter way up just to get a semi-clean image. And certain screens like the inventory menu and V's reflections are still blurry as hell. Ok, so that wasn't just me. I couldn't find a good DLSS setting for my 1440p ultrawide. Besides the above-mentioned issues, road textures and things like elevator/door controls were noticeably low-resolution for a few seconds before resolving clearly.
|
# ? Mar 4, 2021 17:37 |
|
|
|
exquisite tea posted:Cyberpunk's DLSS implementation at 1440p is really lackluster imho. Even at Quality I had to crank the sharpening filter way up just to get a semi-clean image. And certain screens like the inventory menu and V's reflections are still blurry as hell. It really doesn't help that the game is naturally incredibly blurry to begin with. Somebody managed to brute force the TAA off and it resulted in white spot artifacting everywhere.
|
# ? Mar 4, 2021 20:12 |
|
Gonna quote this from the switch thread. bull3964 posted:No, DLSS is pretty much an impossibility.
|
# ? Mar 4, 2021 20:31 |
|
I'm woefully ignorant when it comes to how GPUs work, apologies, but would it be possible to put some fit-for-purpose upscaling device in the dock so it only works in that mode? I don't even know enough to know if that's a stupid question - it would just allow the existing device to scale up dramatically, if it's a thing that is possible.
|
# ? Mar 4, 2021 20:36 |
|
CoolCab posted:I'm woefully ignorant when it comes to how GPUs work, apologies, but would it be possible to put some fit-for-purpose upscaling device in the dock so it only works in that mode? I don't even know enough to know if that's a stupid question - it would just allow the existing device to scale up dramatically, if it's a thing that is possible. I'm sure it's possible, but because the Switch is also a portable, removing it from the dock while playing a game would cause issues. Also, the price increase probably wouldn't be worth it for them. The Switch is a pretty cheap device and they sure aren't looking to increase that price point.
|
# ? Mar 4, 2021 20:39 |
|
Time to re-implement the old Gameboy on/off switch that locks the cart in place! Call it the Turbo Switch like the old Turbo buttons on PCs
|
# ? Mar 4, 2021 20:41 |
|
Any talk about a Switch Pro with cutting edge high detail graphics is completely ignoring Nintendo's entire history of hardware development. The only believable thing in the entire rumour is the OLED screen.
|
# ? Mar 4, 2021 20:45 |
|
MarcusSA posted:I'm sure it's possible, but because the Switch is also a portable, removing it from the dock while playing a game would cause issues. right, and ugh, apologies if it's a little off topic, but that's what sort of appeals to the marketing brain in me - nintendo would be able to sell the existing install base (which is massive) an upgrade at a competitive price that could extend the lifetime of the device significantly. i feel like a moderate chunk of their users would probably buy a 200 dollar dock if it was able to make all your switch games 4k. as i understand it, the switch's dock is currently more or less dumb - it offers the internals of the switch more power but otherwise just passes through and converts the signal to HDMI, connects your usb devices, etc. if it were possible to stick a GPU-like device in there with some RTX cores and have it work the signal it passes through, that might be a very competitive product. but this is all pulled out of my butt.
|
# ? Mar 4, 2021 20:53 |
|
This chart is really old, but I'm not quite sure the demand for a better docked mode is there. If they started to increase the price, it's gonna get closer to the Series X or PS5. It does seem as though Nvidia should be working on a new mobile chip for them, but it doesn't seem like they are.
|
# ? Mar 4, 2021 20:57 |
|
CoolCab posted:i understand it currently the switch's dock is more or less dumb - it offers the internals of the switch more power but otherwise just passes through and converts the signal to HDMI, connects your usb devices, etc. if it was possible to stick a GPU like device in there with some RTX cores and have it work the signal it passes through that might be a very competitive product. but, this is all pulled out of my butt. There's no processing hardware inside it; it's just a charging dock and HDMI passthrough. The Switch's clock speed increases when it's not running on battery power, but that's it. And yeah, there's almost certainly not enough demand for a better docked mode to justify the effort. The Switch Lite sold like crazy, and most games seem to put most of their optimisation focus on running well undocked at this point, to the point where some games actually run worse when docked.
|
# ? Mar 4, 2021 21:01 |
|
njsykora posted:The Switch's clock speed increases when it's not running on battery power but that's it. I believe it only increases render and other targets when on wall power and plugged into an external display. Disappointing since it'd have been nice to get better visuals and smoother gameplay when in handheld mode but on external power - which for me was a portable USB C power brick.
|
# ? Mar 4, 2021 21:23 |
|
njsykora posted:Any talk about a Switch Pro with cutting edge high detail graphics is completely ignoring Nintendo's entire history of hardware development. The only believable thing in the entire rumour is the OLED screen. Agreed. People have been so horny for a Switch Pro that they are willing to recycle the same rumors for the last 3 years.
|
# ? Mar 4, 2021 21:46 |
|
MarcusSA posted:Gonna quote this from the switch thread. MarcusSA posted:This chart is really old but I'm not quite sure the demand for a better docked mode is there.

NVIDIA already has Volta-based and Ampere-based Tegra chips either available or in development, I don't get that guy's argument at all. I would personally say the Volta-based one (Xavier) looks more likely, since the Ampere one (Orin) has quad 10GbE and a higher power budget - it's aimed a little higher-end - plus it's newer, the timeline doesn't fit (sampling mid-year, releasing next year), and Nintendo normally goes older. But Xavier looks plausible to me, I don't see why they couldn't use that. It released two years ago, is configurable down to 10W, and has tensor cores so it could do DLSS 2.0. https://en.wikipedia.org/wiki/Tegra#Xavier

Rumors say the AI upscaling may only be in docked mode, which means a much higher available TDP. Regardless, DLSS gives good speedups on current cards (a 2060, for example) even at 1080p resolution, it is an increase in performance-per-watt, and so far the theories about there being some lower bound where DLSS isn't worth it don't seem to actually be substantiated. Maybe lower in quality, yes, but Nintendo is a first-party studio and can optimize their games to play nicely with the limitations, and their art style is not particularly high-detail so it's less likely to have glaring artifacts. So far it is only in the context of upscaling to 4K (1080p->4K), which is shown to work fairly well overall - there's enough pixels there.

To be blunt, we already saw NVIDIA do a job posting for engineers to work on DLSS 2.0 in an unspecified console - you're welcome to tell me any other brand's console you think that might be. Nothing is confirmed until it's confirmed, but NVIDIA effectively already blew the lid that it's DLSS.

Paul MaudDib fucked around with this message at 23:08 on Mar 4, 2021 |
# ? Mar 4, 2021 22:33 |
|
MarcusSA posted:Gonna quote this from the switch thread. I would take this quote with a huge grain of salt, since they didn't even bother to look up the current switch hardware specs before confidently declaring something impossible.
|
# ? Mar 4, 2021 22:55 |
|
DLSS aside, it's not CRAZY to think that Nintendo could include a high quality, low latency HDMI upscaler/image processor in the dock. I don't think it's particularly likely, but it could even be backwards compatible with the older version of the switch.
|
# ? Mar 4, 2021 23:06 |
|
Moly B. Denum posted:I would take this quote with a huge grain of salt, since they didn't even bother to look up the current switch hardware specs before confidently declaring something impossible. i think that's the joke, op. they also insist on "tensor cores" being absolutely required for DLSS, which is patently silly. tensor cores might lighten the perf hit of running DLSS, but they're definitely not required, since they don't do anything new - they're just specialized to do a thing better
|
# ? Mar 4, 2021 23:08 |
|
Notice it says "DLSS 4k TV mode", what are the odds this is something that only works in docked mode and the new dock has some sort of active cooling solution?
|
# ? Mar 4, 2021 23:28 |
|
Nintendo might not be big on hardware, but I dunno, it could just be Nvidia rolling it into their next SoC by default. It's clear that they see proprietary AI models, more than proprietary hardware, as a value add - there's a lot of potential demand for edge devices that can do AI inferencing. The tensor cores could actually reduce TDP in a mobile setting, as they're typically used to crunch numbers at lower precision; it's RT that's the power hog. Rumors like this tend to be BS, but the guy's rebuttal isn't particularly meaningful.
|
# ? Mar 4, 2021 23:45 |
|
This did make me realize I've had a switch for almost a year and I've never even taken the dock out of the box.
|
# ? Mar 4, 2021 23:48 |
|
A good 650W power supply is good enough for a Ryzen 7 5800X, EVGA RTX 3070 FTW3 Ultra and 4-5 disks (1 NVMe, 1-2 SSD, 2 7200rpm), right? Without the video card I would be certain it's good enough, but I just got my video card and didn't have time to play with it enough before my PC died again, so I'm unsure how much power I'll really pull with both the 3070 and the new Ryzen CPU.

My 850W that I purchased before being able to get an Ampere card (I was planning for a 3080 so I upgraded the PSU, but we all know how that went) is making me nervous... I had 2 power outages in 2 months and they killed my motherboard both times. I'm finally getting a UPS, but I'm not sure this 850W is good, since I did have power problems in the past 3 years and they never killed my system. It only happened with my new 850W (EVGA 850 GO). The UPS should protect against any surges and prevent the shutdowns, but the idea of keeping this new PSU feels wrong.

So I ordered a new CPU and board, as it doesn't look like I can RMA my RMA with Asus, and decided to upgrade by putting the price of a new board for my i7-8700K towards a Ryzen build. My old 650W was rock solid, so I'm thinking of not taking any chances and reverting to that on my new build when I get the parts, but after some Googling I'm not sure 650W is enough (the minimum seems to be 550W, that's close) if I want to try overclocking the 3070, assuming it turns out to be stable and cool at stock.

Can any people here with similar builds comment on whether my EVGA 650 G3 should be OK? It's nearing 4 years old, but I'm pretty strapped for cash after the impulse purchase of the 3070 at scalper price and now the new parts. I'm posting here because the factor I'm unsure of is the RTX 3070. Hemish fucked around with this message at 01:43 on Mar 5, 2021 |
# ? Mar 5, 2021 01:40 |
|
EVGA recommends a 650W at least. Eight cores and two physical drives are going to be a pretty significant power draw - I dunno. You could undervolt it instead to keep it stable?
|
# ? Mar 5, 2021 01:50 |
|
https://videocardz.com/newz/nvidia-geforce-rtx-3080-ti-to-feature-cryptocurrency-mining-limiter Rumors on the 3080 Ti: it's supposed to have the Ethereum mining nerf, and is targeted for an April 2021 release. To me, this kinda suggests that they might not be releasing versions of existing SKUs with JUST the mining nerf, and instead are going to use the mining nerf to justify releasing refreshes sooner rather than later, even if the refresh is only marginally better than the card being replaced.
|
# ? Mar 5, 2021 01:52 |
|
LRADIKAL posted:DLSS aside, it's not CRAZY to think that Nintendo could include a high quality, low latency HDMI upscaler/image processor in the dock. I don't think it's particularly likely, but it could even be backwards compatible with the older version of the switch. Some sort of upscaler, sure. You can't have an actual DLSS post-processor that operates via an HDMI passthrough, though, because DLSS isn't just a post-processing step. It needs to be integrated fairly early into the rendering pipeline, and some types of effects and shaders have to be composited in at full resolution afterwards.
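A rough sketch of the ordering being described (illustrative pseudocode - none of these function names are a real engine API):

```python
# Illustrative pseudocode only: rasterize, dlss_upscale, next_jitter and
# composite_ui_and_post are stand-ins for engine stages, not a real API.
def render_frame(scene, render_res, out_res):
    # Rasterize at the low internal resolution, with a sub-pixel camera
    # jitter that changes every frame (DLSS needs this for its samples).
    color, depth, motion_vectors = rasterize(scene, render_res, jitter=next_jitter())

    # DLSS consumes raw engine buffers, not a finished image, so it has
    # to run inside the renderer - an HDMI passthrough device only ever
    # sees the final composited frame and lacks this data.
    upscaled = dlss_upscale(color, depth, motion_vectors, target=out_res)

    # UI and some post effects are composited afterwards, at full
    # output resolution.
    return composite_ui_and_post(upscaled, out_res)
```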
|
# ? Mar 5, 2021 01:55 |
|
DLSS is just branding and Nvidia already has an AI-based upsampler on their Shield devices that doesn't use temporal info
|
# ? Mar 5, 2021 01:56 |
|
|
Hemish posted:A good 650W power supply is good enough for a Ryzen 7 5800x, EVGA RTX 3070 FTW3 Ultra and 4-5 disks (1 NVMe, 1-2 SSD, 2 7200rpm), right? The RTX 3070 FE tops out at 225 watts stock; Gamers Nexus was able to overclock it to draw 247 watts. The Ryzen 7 5800X tops out at 127 watts. That's 374 watts between the two parts - it seems like you'd have more than enough headroom for the rest with a 650 watt PSU. And mind you, these power tests are done with FurMark on the GPU and Blender on the CPU, which are loads you probably aren't hitting in a lot of cases
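Back-of-envelope version of that headroom math (the GPU/CPU figures are from above; the smaller line items are rough assumptions for the rest of the build, not measurements):

```python
# Rough peak-draw budget. GPU and CPU numbers are the measured figures
# quoted above; the rest are ballpark assumptions.
draws_w = {
    "RTX 3070 (overclocked)": 247,
    "Ryzen 7 5800X": 127,
    "motherboard + RAM": 50,
    "NVMe + 2x SATA SSD + 2x HDD": 30,
    "fans / peripherals": 20,
}

psu_capacity = 650
total = sum(draws_w.values())
headroom = psu_capacity - total
print(f"estimated peak: {total} W, headroom on a {psu_capacity} W PSU: {headroom} W")
```

Even with generous allowances for everything besides the GPU and CPU, a quality 650W unit has well over 100W of margin.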
|
# ? Mar 5, 2021 01:59 |