K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I still doubt that part of the rumor, but if there is actual DLSS, I'm curious how it's going to be cooled. DLSS isn't computationally trivial, so that's a significant amount of extra heat to dissipate. It will definitely be a docked-only feature if it does exist - but as I said, probably BS.

CoolCab
Apr 17, 2005

glem
could they stick the extra processing components (RT cores???) in the dock? if they could sell a 150 dollar dock that upgrades the existing switch install base that would be a killer feature, but I kind of doubt it from Nintendo tbh

v1ld
Apr 16, 2012

bus hustler posted:

DLSS on the switch itself would rule, render Cyberpunk at 240p and upscale to 720p

Thought DLSS required more base pixels than that to work with, so render at 720p and upscale to 1440p?

bus hustler
Mar 14, 2019

v1ld posted:

Thought DLSS required more base pixels than that to work with, so render at 720p and upscale to 1440p?

https://www.reddit.com/r/nvidia/comments/g7rfzb/control_comparison_of_very_low_dlss_20/ you can do it or force it in Control at least.

v1ld
Apr 16, 2012

Neat! That grain is going to be far less discernible on the Switch's small screen.

ufarn
May 30, 2009

v1ld posted:

Thought DLSS required more base pixels than that to work with, so render at 720p and upscale to 1440p?
The recent "Ultra Performance" mode for DLSS upscales a whole 9x. Obviously the result isn't amazing, but DLSS can do a lot of work, not to mention if it's set to dynamically change.

v1ld
Apr 16, 2012

ufarn posted:

The recent "Ultra Performance" mode for DLSS upscales a whole 9x. Obviously the result isn't amazing, but DLSS can do a lot of work, not to mention if it's set to dynamically change.

So DLSS allows for dynamic resolution scaling on the input? For example, maintain a fixed 1080p output while changing the source resolution to match current render/compute constraints?

My original question was about whether a certain minimum input resolution is needed to get good results from the temporal accumulation work it's doing to upscale. Seems like it gets reasonable, if grainy, results even at very low input resolutions.

repiv
Aug 13, 2009

v1ld posted:

So DLSS allows for dynamic resolution scaling on the input? For example, maintain a fixed 1080p output while changing the source resolution to match current render/compute constraints?

DLSS 2.1 added the ability to vary the input resolution on the fly yes, but like VR support I don't think any games have shipped that feature yet
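Conceptually it's just a dynamic resolution loop in front of a fixed-output upscaler - something like this sketch (not the actual DLSS SDK API, just the general shape of the idea):

code:
# Toy dynamic-resolution loop feeding a fixed-output upscaler.
# Not the real DLSS SDK - just the general idea of DRS on the input side.
TARGET_MS = 16.7            # frame budget for 60 fps
OUT_W, OUT_H = 1920, 1080   # fixed output resolution
MIN_SCALE, MAX_SCALE = 0.5, 1.0

scale = 1.0

def next_render_res(last_frame_ms):
    """Nudge the per-axis render scale based on how the last frame went."""
    global scale
    if last_frame_ms > TARGET_MS:
        scale = max(MIN_SCALE, scale - 0.05)  # over budget: render fewer pixels
    else:
        scale = min(MAX_SCALE, scale + 0.05)  # under budget: claw quality back
    return int(OUT_W * scale), int(OUT_H * scale)

print(next_render_res(20.0))  # e.g. (1824, 1026) after a slow frame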

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

repiv posted:

DLSS 2.1 added the ability to vary the input resolution on the fly yes, but like VR support I don't think any games have shipped that feature yet

Doesn't Cyberpunk have it? There's an auto DLSS setting that I thought targeted a specific framerate and changed variably to try and hit it

repiv
Aug 13, 2009

That's what I thought it would be too, but IIRC that DLSS Auto mode just picks a fixed preset based on your resolution (e.g. Quality at 1080p, Performance at 1440p, etc)
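i.e. roughly this mapping, as far as I can tell (a paraphrase of the reported behaviour, not the game's actual code, and anything beyond the 1080p/1440p examples is a guess):

code:
# Reported behaviour of Cyberpunk's DLSS "Auto" setting: a fixed preset chosen
# from the output resolution, not a per-frame adjustment. Illustrative only.
PRESET_BY_RESOLUTION = {
    (1920, 1080): "Quality",
    (2560, 1440): "Performance",
    (3840, 2160): "Ultra Performance",  # assumed - the post only cites 1080p/1440p
}

def auto_dlss_preset(out_res):
    return PRESET_BY_RESOLUTION.get(out_res, "Performance")  # fallback guess

print(auto_dlss_preset((2560, 1440)))  # Performance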

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



denereal visease posted:

Thanks for this good post, my industry colleague. An idle musing more than anything, but I wonder what the fabs think about IDPR/DPR water as input to their process train. I have to imagine the facilities are doing additional water treatment at the fab after receiving water from the distribution system, or that they're connected to a raw water supply and do all treatment on-site.

Thanks! Regarding IDPR/DPR, I think they will end up treating either just as they would any basic connection to a municipal water system. I don't yet have concrete information on what additional treatment(s) TSMC may be doing, but using Intel's (and others') fabs that are located here as a reference, they're connected to the actual distribution system, then have EQ tanks on-site that allow them to maintain continuous supply even during peak hour demand, unexpected waterline shutdowns, etc. From there, the water is further treated for actual process needs and any systems that utilize closed loops, but for anything that uses an open loop (such as heat exchangers), the distribution water is more than sufficient.

One of the main issues seen is high TDS depending on the source: groundwater here is usually the "lowest" (though quite high by other standards in the US), while CAP can be pretty high and SRP can be astronomical depending on the time of year. Most of the plants can bring it down some via various treatment technologies that are used, but it's still not uncommon for CAP and SRP-sourced distribution systems to see TDS around 700 - 1000 ppm (a lot of times, the cities will blend with groundwater, which brings it down some).

So most of the water-intensive industries in Phoenix that are also sensitive to TDS typically have to do some pre-treatment to at least bring the TDS down.

The planning that I have seen for IDPR/DPR involves treating it to a water quality substantially above that of the distribution system (to demonstrate it's safe, get public approval, and, more critically, regulatory approval). In most cases, I think it will then be blended into the system, but if TSMC were able to consume IDPR/DPR water directly, it would likely reduce their pre-treatment requirements.
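If it helps, the blending I mentioned is just a flow-weighted average - the numbers below are made up for illustration, not actual CAP/SRP/groundwater data:

code:
# Flow-weighted TDS blend (illustrative values only, not real Phoenix-area data)
def blended_tds(sources):
    """sources: list of (flow_fraction, tds_ppm) tuples; returns blended TDS in ppm."""
    total_flow = sum(q for q, _ in sources)
    return sum(q * c for q, c in sources) / total_flow

# e.g. 60% surface water at ~900 ppm blended with 40% groundwater at ~450 ppm
print(round(blended_tds([(0.6, 900), (0.4, 450)])))  # 720 ppm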

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

repiv posted:

That's what I thought it would be too, but IIRC that DLSS Auto mode just picks a fixed preset based on your resolution (e.g. Quality at 1080p, Performance at 1440p, etc)

That's really stupid but I guess I shouldn't have expected better from that game

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Cyberpunk's DLSS implementation at 1440p is really lackluster imho. Even at Quality I had to crank the sharpening filter way up just to get a semi-clean image. And certain screens like the inventory menu and V's reflections are still blurry as hell.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

exquisite tea posted:

Cyberpunk's DLSS implementation at 1440p is really lackluster imho. Even at Quality I had to crank the sharpening filter way up just to get a semi-clean image. And certain screens like the inventory menu and V's reflections are still blurry as hell.

Ok so that wasn't just me. I couldn't find a good DLSS setting for my 1440p ultrawide. Besides the issues mentioned above, road textures and things like elevator/door controls were noticeably low-resolution for a few seconds before resolving clearly.

ufarn
May 30, 2009

exquisite tea posted:

Cyberpunk's DLSS implementation at 1440p is really lackluster imho. Even at Quality I had to crank the sharpening filter way up just to get a semi-clean image. And certain screens like the inventory menu and V's reflections are still blurry as hell.
To be fair, the base image of CP2077 looks like actual crap so DLSS probably doesn't have a lot to work with. The game looks incredibly weird without TAA on.

denereal visease
Nov 27, 2002

"Research your own experience. Absorb what is useful, reject what is useless, add what is essentially your own."

SourKraut posted:

Thanks! Regarding IDPR/DPR, I think they will end up treating either just as they would any basic connection to a municipal water system. I don't yet have concrete information on what additional treatment(s) TSMC may be doing, but using Intel's (and others') fabs that are located here as a reference, they're connected to the actual distribution system, then have EQ tanks on-site that allow them to maintain continuous supply even during peak hour demand, unexpected waterline shutdowns, etc. From there, the water is further treated for actual process needs and any systems that utilize closed loops, but for anything that uses an open loop (such as heat exchangers), the distribution water is more than sufficient.
This all makes sense, but I did appreciate the note on an EQ tank on-site; I generally don't follow the water past arrival at customers.

SourKraut posted:

One of the main issues seen is high TDS depending on the source: groundwater here is usually the "lowest" (though quite high by other standards in the US), while CAP can be pretty high and SRP can be astronomical depending on the time of year. Most of the plants can bring it down some via various treatment technologies that are used, but it's still not uncommon for CAP and SRP-sourced distribution systems to see TDS around 700 - 1000 ppm (a lot of times, the cities will blend with groundwater, which brings it down some).

So most of the water-intensive industries in Phoenix that are also sensitive to TDS typically have to do some pre-treatment to at least bring the TDS down.
That's really interesting to me that the groundwater source has the lowest TDS compared to the surface water sources, not what I would've guessed; but if CAP/SRP flows are highly runoff-sensitive I could see them being higher in TDS.

SourKraut posted:

The planning that I have seen for IDPR/DPR involves treating it to a water quality substantially above that of the distribution system (to demonstrate it's safe, get public approval, and, more critically, regulatory approval). In most cases, I think it will then be blended into the system, but if TSMC were able to consume IDPR/DPR water directly, it would likely reduce their pre-treatment requirements.
Yup, everything I hear about utilities trying to advance IDPR/DPR has a huge public outreach/education component in addition to treating the water to elevated standards (I would agree that the fabs consuming IDPR/DPR directly would probably have a reduced on-site treatment need in this case).

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

exquisite tea posted:

Cyberpunk's DLSS implementation at 1440p is really lackluster imho. Even at Quality I had to crank the sharpening filter way up just to get a semi-clean image. And certain screens like the inventory menu and V's reflections are still blurry as hell.

It really doesn't help that the game is naturally incredibly blurry to begin with. Somebody managed to brute force the TAA off and it resulted in white spot artifacting everywhere.

MarcusSA
Sep 23, 2007


Gonna quote this from the switch thread.

bull3964 posted:

No, DLSS is pretty much an impossibility.

Look (this goes to everyone and not just this speculation). The current ShieldTV has a very good 1080p->4k upscale engine. It will likely work even better on 3d graphics. This is not DLSS which requires tensor cores which the X1 does not have. The AI Upscaling that the ShieldTV has is likely how they are going to get the 4k output. I'm not sure how much processing power the AI upscaling engine uses, but they can probably squeeze the extra performance out of the X1 from a redesigned thermal system and just increasing yields since it's been 2 years since they started making 16nm X1 parts

Also, the current switch has a peak TDP of 15w for compute and graphics. There's just no way in unholy hell that they are going to be able to include graphic features of a 90w-160w (graphics only) package.

And no, Nvidia isn't going to magic up some never before seen ground up silicon to toss into the switch that no one at all has gotten any whiff of at all especially when they haven't developed a mobile SoC (aside from the X1 die shrink) in like 5 years. Maybe, MAYBE, we'll see a X2 being used which will give us Pascal based graphics, but we're talking maybe like a 12% increase in capability.

The Switch, as it stands, is fairly dead-ended in terms of significant advances without ditching Nvidia and going with some other ARM chip supplier (likely Qualcomm or maybe MediaTek). However, that would likely be a lot of work, because I'm sure they are using Nvidia-specific instructions that would need some sort of middleware or emulation. I mean, I guess Nvidia could jump back into a market they abandoned, but that's going to take quite the ramp-up.

Also, the Switch is a dual-core system (well, technically 4 cores, but the two low-power cores aren't used). Current mobile SoCs are all optimized around 8 cores. So, again, they aren't really well suited to just drop in and replace the X1.

And this is all forgetting the fact that a ground up redesigned SoC will be 7nm or 5nm and good loving luck getting manufacturing capacity to produce them at scale right now. Qualcomm renamed their Snapdragon 865 from 2020 to something new so they could sell it in 2021 because they know they won't be able to make enough 5nm SD888.

When a true next gen switch is coming down the pipe we will know because there will be a very unmistakably curious new ARM SoC making the rounds 9-18 months in advance. Until then it's going to be continual tweaking of the X1 to provide marginal quality of life improvements to the existing system.

CoolCab
Apr 17, 2005

glem
I'm woefully ignorant when it comes to how GPUs work, apologies, but would it be possible to put some purpose-built upscaling device in the dock so it only works in that mode? I don't even know enough to know if that's a stupid question - it would just allow the existing device to scale up dramatically, if it's a thing that is possible.

MarcusSA
Sep 23, 2007

CoolCab posted:

I'm woefully ignorant when it comes to how GPUs work, apologies, but would it be possible to put some purpose-built upscaling device in the dock so it only works in that mode? I don't even know enough to know if that's a stupid question - it would just allow the existing device to scale up dramatically, if it's a thing that is possible.

I’m sure it’s possible, but because the Switch is also a portable, removing it from the dock while playing a game would cause issues.

Also the price increase probably wouldn’t be worth it for them to do that.

The switch is a pretty cheap device and they sure aren’t looking to increase that price point.

Nybble
Jun 28, 2008

praise chuck, raise heck
Time to re-implement the old Gameboy on/off switch that locks the cart in place! Call it the Turbo Switch like the old Turbo buttons on PCs

njsykora
Jan 23, 2012

Robots confuse squirrels.


Any talk about a Switch Pro with cutting edge high detail graphics is completely ignoring Nintendo's entire history of hardware development. The only believable thing in the entire rumour is the OLED screen.

CoolCab
Apr 17, 2005

glem

MarcusSA posted:

I’m sure it’s possible, but because the Switch is also a portable, removing it from the dock while playing a game would cause issues.

Also the price increase probably wouldn’t be worth it for them to do that.

The switch is a pretty cheap device and they sure aren’t looking to increase that price point.

right, and ugh apologies if it's a little off topic, that's what sort of appeals to the marketing brain in me - nintendo would be able to sell the existing install base (which is massive) an upgrade at a competitive price that could extend the lifetime of the device significantly. i feel like a moderate chunk of their users would probably buy a 200 dollar dock if it was able to make all your switch games 4k.

as i understand it currently the switch's dock is more or less dumb - it offers the internals of the switch more power but otherwise just passes through and converts the signal to HDMI, connects your usb devices, etc. if it was possible to stick a GPU like device in there with some RTX cores and have it work the signal it passes through that might be a very competitive product. but, this is all pulled out of my butt.

MarcusSA
Sep 23, 2007

This chart is really old but I’m not quite sure the demand for a better docked mode is there.



If they started to increase the price it’s gonna get closer to the series X or Ps5.

It does seem as though Nvidia should be working on a new mobile chip for them but it doesn’t seem like they are.

njsykora
Jan 23, 2012

Robots confuse squirrels.


CoolCab posted:

i understand it currently the switch's dock is more or less dumb - it offers the internals of the switch more power but otherwise just passes through and converts the signal to HDMI, connects your usb devices, etc. if it was possible to stick a GPU like device in there with some RTX cores and have it work the signal it passes through that might be a very competitive product. but, this is all pulled out of my butt.

It has no processing hardware inside it; it's just a charging dock and HDMI passthrough. The Switch's clock speed increases when it's not running on battery power but that's it.

And yeah, there's almost certainly not enough demand for better docked to justify the effort. The Switch Lite sold like crazy and most games seem to put most of their optimisation focus on running well undocked at this point, to the point where some games actually run worse when docked.

v1ld
Apr 16, 2012

njsykora posted:

The Switch's clock speed increases when it's not running on battery power but that's it.

I believe it only increases render and other targets when on wall power and plugged into an external display. Disappointing since it'd have been nice to get better visuals and smoother gameplay when in handheld mode but on external power - which for me was a portable USB C power brick.

Cygni
Nov 12, 2005

raring to post

njsykora posted:

Any talk about a Switch Pro with cutting edge high detail graphics is completely ignoring Nintendo's entire history of hardware development. The only believable thing in the entire rumour is the OLED screen.

Agreed. People have been so horny for a Switch Pro that they are willing to recycle the same rumors for the last 3 years.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

MarcusSA posted:

Gonna quote this from the switch thread.

MarcusSA posted:

This chart is really old but I’m not quite sure the demand for a better docked mode is there.



If they started to increase the price it’s gonna get closer to the series X or Ps5.

It does seem as though Nvidia should be working on a new mobile chip for them but it doesn’t seem like they are.

NVIDIA already has Volta-based and Ampere-based Tegra chips either available or in development, so I don't get that guy's argument at all. I would personally say the Volta-based one (Xavier) looks more likely: the Ampere one (Orin) has quad 10GbE and a higher power budget, is aimed a little higher-end, and is newer, and its timeline doesn't fit (sampling mid-year, releasing next year) when Nintendo normally goes older. But Xavier looks plausible to me; I don't see why they couldn't use it. It released two years ago, is configurable down to 10W, and has tensor cores, so it could do DLSS 2.0.

https://en.wikipedia.org/wiki/Tegra#Xavier

Rumors say the AI upscaling may only be in docked mode, which means a much higher available TDP. Regardless, DLSS gives good speedups on current cards (the 2060, for example) even at 1080p, it's an increase in performance-per-watt, and so far the theories about there being some lower bound where DLSS isn't worth it don't seem to actually be substantiated. Maybe lower in quality, yes, but Nintendo is a first-party studio and can optimize their games to play nicely with the limitations, and their art style is not particularly high-detail, so it's less likely to have glaring artifacts. So far it's only in the context of upscaling to 4K (1080p->4K), which is shown to work fairly well overall; there are enough pixels there.

To be blunt, we already saw NVIDIA do a job posting for engineers to work on DLSS 2.0 in an unspecified console - you're welcome to tell me any other brand's console you think that might be. :) Nothing is confirmed until it's confirmed, but NVIDIA effectively already blew the lid off the fact that it's DLSS.

Paul MaudDib fucked around with this message at 23:08 on Mar 4, 2021

Moly B. Denum
Oct 26, 2007

MarcusSA posted:

Gonna quote this from the switch thread.

I would take this quote with a huge grain of salt, since they didn't even bother to look up the current switch hardware specs before confidently declaring something impossible.

LRADIKAL
Jun 10, 2001

Fun Shoe
DLSS aside, it's not CRAZY to think that Nintendo could include a high quality, low latency HDMI upscaler/image processor in the dock. I don't think it's particularly likely, but it could even be backwards compatible with the older version of the switch.

Truga
May 4, 2014
Lipstick Apathy

Moly B. Denum posted:

I would take this quote with a huge grain of salt, since they didn't even bother to look up the current switch hardware specs before confidently declaring something impossible.

i think that's the joke, op

they also insist on "tensor cores" being absolutely required for DLSS, which is patently silly. tensor cores might lighten the perf hit of running DLSS, but they're definitely not required since they don't do anything new, they're just specialized to do a thing better

Q_res
Oct 29, 2005

We're fucking built for this shit!
Notice it says "DLSS 4k TV mode", what are the odds this is something that only works in docked mode and the new dock has some sort of active cooling solution?

shrike82
Jun 11, 2005
Nintendo might not be big on hardware but I dunno, it could just be Nvidia rolling it into their next SOC by default. It’s clear that they see proprietary AI models over proprietary hardware as a value add - there’s a lot of potential demand for edge devices that can do AI inferencing.

The tensor cores could actually potentially reduce TDP in a mobile setting as they’re typically used to crunch numbers at lower precision. It’s RT that’s the power hog.

Rumors like this tend to be BS, but the guy's rebuttal isn't particularly meaningful.

bus hustler
Mar 14, 2019

This did make me realize I've had a switch for almost a year and I've never even taken the dock out of the box.

Hemish
Jan 25, 2005

A good 650W power supply is good enough for a Ryzen 7 5800X, an EVGA RTX 3070 FTW3 Ultra, and 4-5 disks (1 NVMe, 1-2 SSDs, 2 7200rpm), right? Without the video card I would be certain it's good enough, but I just got my video card and didn't have time to play with it enough before my PC died again, so I'm unsure how much power I'll really pull with both the 3070 and the new Ryzen CPU.

My 850W that I purchased before being able to get an Ampere card (I was planning for a 3080, so I upgraded the PSU, but we all know how that went) is making me nervous... I got 2 power outages in 2 months and it killed my motherboard both times. I'm finally getting a UPS, but I'm not sure this 850W is good, since I did have power problems in the past 3 years and they never killed my system - it only happened with my new 850W (EVGA 850 GO). The UPS should protect against any surges and prevent the shutdowns, but the idea of keeping this new PSU feels wrong.

So I ordered a new CPU and board, as it doesn't look like I can RMA my RMA with Asus, and decided to upgrade by putting the price of a new board for my i7-8700K towards a Ryzen build.

My old 650W was rock solid, so I'm thinking of not taking any chances and reverting to that on my new build when I get the parts, but after some Googling I'm not sure 650W is enough (the minimum seems to be 550W, which is close) if I want to try overclocking the 3070, assuming it turns out to be stable and cool at stock. Can anyone here with a similar build comment on whether my EVGA 650 G3 should be OK? It's nearing 4 years old, but I'm pretty strapped for cash after the impulse purchase of the 3070 at scalper price and now the new parts.

I'm posting here because the factor I'm unsure of is the RTX 3070.

Hemish fucked around with this message at 01:43 on Mar 5, 2021

CoolCab
Apr 17, 2005

glem
EVGA recommends a 650 at least. Eight cores and several physical drives are going to be pretty significant power draw - I dunno. You could undervolt it instead to keep it stable?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://videocardz.com/newz/nvidia-geforce-rtx-3080-ti-to-feature-cryptocurrency-mining-limiter

Rumors of the 3080Ti: it's supposed to have the Ethereum mining nerf, and is supposed to be targeted for an April 2021 release.

To me, this kinda suggests that they might not be releasing versions of existing SKUs with JUST the mining nerf, and instead are going to use the mining nerf to justify releasing refreshes sooner rather than later, even if the refresh is only marginally better than the card being replaced.

Sagebrush
Feb 26, 2012

LRADIKAL posted:

DLSS aside, it's not CRAZY to think that Nintendo could include a high quality, low latency HDMI upscaler/image processor in the dock. I don't think it's particularly likely, but it could even be backwards compatible with the older version of the switch.

Some sort of upscaler, sure.

You can't have an actual DLSS post-processor that operates via an HDMI passthrough, though, because DLSS isn't just a post-processing step. It needs to be integrated fairly early into the rendering pipeline, and some types of effects and shaders have to be composited in at full resolution afterwards.
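Roughly, the ordering looks like this - a sketch of the general pattern with made-up stand-in functions, not any real engine or SDK:

code:
# Where DLSS-style upscaling sits inside a frame, versus a dock that only ever
# sees the finished HDMI signal. Every function here is a stand-in.
from types import SimpleNamespace

def rasterize(scene, res):
    # low-res geometry/lighting pass; also produces the per-pixel data the upscaler needs
    return SimpleNamespace(color="low-res color", motion="motion vectors", depth="depth")

def dlss_like_upscale(color, motion, depth, jitter, target):
    # consumes engine-side buffers - exactly what an HDMI passthrough box never has
    return f"{target} image reconstructed from {color} + {motion} + {depth} + {jitter}"

def composite_ui_and_post(image, target):
    # UI, film grain, and some effects get composited at full output resolution afterwards
    return f"{image}, then UI/post composited at {target}"

def render_frame(scene, render_res=(1280, 720), out_res=(3840, 2160)):
    g = rasterize(scene, render_res)
    up = dlss_like_upscale(g.color, g.motion, g.depth, scene.camera_jitter, out_res)
    return composite_ui_and_post(up, out_res)

print(render_frame(SimpleNamespace(camera_jitter="sub-pixel jitter")))

A dock-side scaler only ever sees that final composited frame, with no motion vectors, depth, or jitter info, which is why it can't do the same job.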

shrike82
Jun 11, 2005
DLSS is just branding and Nvidia already has an AI-based upsampler on their Shield devices that doesn't use temporal info

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Hemish posted:

A good 650W power supply is good enough for a Ryzen 7 5800X, an EVGA RTX 3070 FTW3 Ultra, and 4-5 disks (1 NVMe, 1-2 SSDs, 2 7200rpm), right?

the RTX 3070 FE stock tops out at 225 watts. Gamers Nexus was able to overclock it to draw 247 watts

the Ryzen 7 5800X tops out at 127 watts

that's 374 watts between the two parts - it seems like you'd have more than enough headroom for the rest with a 650 watt PSU

and mind you these power tests are done with FurMark on the GPU and Blender on the CPU which are loads that you probably aren't hitting in a lot of cases
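if you want to sanity-check the headroom yourself, it's just adding up worst-case draws (the non-CPU/GPU allowances below are rough guesses, not measurements):

code:
# worst-case power budget check; GPU/CPU numbers are the ones cited above,
# the rest are rough allowances
PSU_WATTS = 650

draws = {
    "RTX 3070 (overclocked, FurMark)": 247,
    "Ryzen 7 5800X (Blender)": 127,
    "motherboard / RAM / fans (allowance)": 50,    # assumed
    "drives: NVMe + SSDs + HDDs (allowance)": 30,  # assumed
}

total = sum(draws.values())
print(f"worst case ~{total} W of {PSU_WATTS} W -> {PSU_WATTS - total} W headroom")
# worst case ~454 W of 650 W -> 196 W headroom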
