K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Paul MaudDib posted:

why are you booing me? I’m right.

You may or may not be. The reality is that just because Nvidia has the ability to accelerate certain ops at a certain precision doesn't mean that it's necessary. Doing things a particular way simply because their hardware can would not be a first for Nvidia.

That said, simply from a software perspective, AMD / MS / whoever else anyone may fantasize about introducing a DLSS equivalent is years behind. It takes time to get things built and integrated, and Nvidia is by far the world leader in realtime graphics when it comes to actually shipping solutions that see real use in the market.


VorpalFish
Mar 22, 2007
reasonably awesome™

Rolo posted:

So this shitshow was the first time I tried to get popular tech on release. Is it going to be about the same trying to get a zen3? I’m on a Ryzen 7 3700x so I’m not in a rush but some friends are going to be trying.

I would expect zen3 to be easier at least in part because pricing is way less attractive than the outgoing generation.

jkyuusai
Jun 26, 2008

homegrown man milk

Kazinsal posted:

Yeah, at some point Microsoft is going to just go, "gently caress it, here's the secret sauce" and everyone will have access to it.

Right now the way it's described, DLSS 2.0 is some masterpiece of Nvidia training every single individual game that supports it on petabytes of frame data and somehow condensing that down into ML instructions that can be loaded onto a single GPU's tensor cores. Now, I may be a CPU optimization and software optimization type, but that's pretty far into the "baffling accomplishments" realm.

2.0 uses a generalized model, it doesn't require them to train every game individually. The dev adds support for it by adding in some unknown bits + movement hints for objects in each frame to help the thing make choices.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I accidentally (?) bought a monitor that supports AMD FreeSync even when I wasn't really looking for it (Samsung S27R350FHE for anyone curious). Is that a valuable enough feature for me to go out of my way to get an AMD card (I'm on an RX 580 right now) when I was planning to get an RTX card (whether Turing or Ampere, availability here in the PH is weird) in December?

TerminalSaint
Apr 21, 2007


Where must we go...

we who wander this Wasteland in search of our better selves?

shrike82 posted:

lol dude why do you need to pretend to know ML to win an argument

I'm still not sure what Marxism-Leninism has to do with video cards.

shrike82
Jun 11, 2005

people should realize that Nvidia marketing something as a tensor core exclusive feature doesn't mean that the hardware is actually a prereq.
and for paul's benefit, GEMM (the relevant tensor matrix multiply operation) is hardware accelerated on all video cards (including on AMD cards) - what the tensor cores do is speed up lower precision (fp16, bfloat16, and i believe tf32 with Ampere) GEMM. there's nothing specific to tensor cores that makes DLSS inferencing exclusive to it. funnily, nvidia themselves have a (single frame) AI upscaling solution running 4K60 on their Shield TVs (with a slow Tegra X1+ that doesn't have tensor cores).

it's not clear the frame + movement vectors and previous frame CNN autoencoder approach they've gone with is the only/best way to do game upscaling - just look at how fast Nvidia is iterating on their implementations. given how fast the broader ML field moves, it would not surprise me to see Microsoft or a game developer roll out an implementation of a better approach that some random academic group (or even Microsoft Research) publishes out of nowhere.

and most importantly, it's something Microsoft is keen to implement with their new consoles
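
for the curious, a minimal pytorch sketch of that point (my own toy example, arbitrary sizes, nothing DLSS-specific) - the low precision matmul is the same call on every card, tensor cores only change how fast it runs:

code:
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# plain fp32 GEMM - runs on any card, no tensor cores required
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c_fp32 = a @ b

# fp16 GEMM - same call; cuBLAS routes it to tensor cores when the card has
# them and falls back to the ordinary CUDA cores when it doesn't
if device == "cuda":
    c_fp16 = a.half() @ b.half()
    print(c_fp16.dtype)  # torch.float16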

shrike82 fucked around with this message at 15:49 on Oct 17, 2020

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

gradenko_2000 posted:

I accidentally (?) bought a monitor that supports AMD FreeSync even when I wasn't really looking for it (Samsung S27R350FHE for anyone curious). Is that a valuable enough feature for me to go out of my way to get an AMD card (I'm on an RX 580 right now) when I was planning to get an RTX card (whether Turing or Ampere, availability here in the PH is weird) in December?

Hard to say. It's been known that some FreeSync monitors are fine to use with G-Sync even if not certified, but since there's only a handful of Samsung displays actually certified compatible on Nvidia's own compatibility list, I wouldn't count on it. But any mid to high tier card will certainly go faster than 75Hz on your monitor so you're probably stuck with VSync.

Edit: Reread the specs, it's a FreeSync 1 monitor so it actually won't work at all with G-Sync and it's a 1080p screen. Any new card you get now will keep it at the max refresh rate and beyond honestly.

8-bit Miniboss fucked around with this message at 15:47 on Oct 17, 2020

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

8-bit Miniboss posted:

But any mid to high tier card will certainly go faster than 75hz on your monitor so you're probably stuck with VSync.

Any new card you get now will keep it at the max refresh rate and beyond honestly.

okay cool thanks.

jaete
Jun 21, 2009


Nap Ghost

Xachariah posted:

For any UK guys here, don't sleep on Curry's. Just got an email confirming my order was dispatched for a Ventus which was listed on there for £680.00.

I expected it to be a bamboozle or get cancelled due to being a glitch considering the price is the lowest I've seen the RTX 3080 go for in the EU apart from the £650 FE. I was also only looking at the more expensive TUF listings at the time since there was a rumour that Curry's was getting some of those in stock sometime in the last week.

Yeah I noticed Curry's as well, a day or two ago. There was a message on one of the Discords saying Curry's have some 3080s up but also sorry that was actually like 5 min ago we don't have a bot for it yet so didn't notify you all sooner.

I was also running the stupid JavaScript bot in the background, and it was checking Curry's, but it somehow missed it at first as well, it alerted me some 3 minutes after I saw the Discord message. So in total I was maybe 8 minutes late. Had no chance. :v:

I'm wondering if I should just buy a 2070 or something now, rather than wait until the 3070 launch - the way this is going, it wouldn't surprise me if after the 3070 launch all 20x0 cards will get sold out

spunkshui
Oct 5, 2011



jaete posted:

Yeah I noticed Curry's as well, a day or two ago. There was a message on one of the Discords saying Curry's have some 3080s up but also sorry that was actually like 5 min ago we don't have a bot for it yet so didn't notify you all sooner.

I was also running the stupid JavaScript bot in the background, and it was checking Curry's, but it somehow missed it at first as well, it alerted me some 3 minutes after I saw the Discord message. So in total I was maybe 8 minutes late. Had no chance. :v:

I'm wondering if I should just buy a 2070 or something now, rather than wait until the 3070 launch - the way this is going, it wouldn't surprise me if after the 3070 launch all 20x0 cards will get sold out

I’m hoping that AMD has something good and that plus the 3070 takes a lot of pressure off the 3080.

I just don’t think that many people are really looking to spend close to a grand on their graphics card.

Great price/performance $500 cards are going to be popular.

But what I don’t know is how hard Samsung is loving this up. Nvidia putting out news articles that say they are going to switch to a different fab next year points to a large amount of loving it up.

spunkshui fucked around with this message at 18:57 on Oct 17, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Pivo posted:

No, that's not a very good point.

Pixel density at normal viewing distances for 4K is high enough that additional pixels are wasted. Think camera sensors - we used to always want more megapixels when we barely had any, but now that a 60MP full-frame sensor outresolves most lenses, we start seeking higher quality pixels instead. Same with displays ... the pixel density is sufficiently high now that we are chasing high-framerate, low persistence, higher contrast ratio, higher peak brightness -- better pixels -- rather than more of them.

Resolution is quantifiable, 'higher quality pixels' are not. Go 'not a good point' at me again with more microsoft pr buzzspeak, paul mauddib needs the competition for hot takes

Sagebrush
Feb 26, 2012

i think it is obvious that by "higher quality pixels" pivo meant "other characteristics of the image beyond spatial resolution." For instance the pixels' refresh rate, their color gamut and accuracy, their brightness, their black points.

Same as how you don't need more than about 24mp in your camera for the vast majority of uses, so instead of just cramming in more photosites, you can make the ones you have larger with closer spacing so they capture more light, and design them to have a broader range of exposure before clipping, etc and your images will be subjectively better.

Xarn
Jun 26, 2015

Sagebrush posted:

Two people who both managed to download and compile TensorFlow slapping the poo poo out of each other over which one is the true genius AI expert :allears:

Hey now, compiling TF is not easy, because lol Google.

shrike82
Jun 11, 2005

joke's on you i use pytorch

hobbesmaster
Jan 28, 2008

AirRaid posted:

"Tensor" is literally a trademark name that nvidia came up with for a thing, it is not a type of thing in itself.

If this was true I might’ve gotten better than a C in engineering electromagnetics.

MonkeyLibFront
Feb 26, 2003
Where's the cake?

jaete posted:

Yeah I noticed Curry's as well, a day or two ago. There was a message on one of the Discords saying Curry's have some 3080s up but also sorry that was actually like 5 min ago we don't have a bot for it yet so didn't notify you all sooner.

I was also running the stupid JavaScript bot in the background, and it was checking Curry's, but it somehow missed it at first as well, it alerted me some 3 minutes after I saw the Discord message. So in total I was maybe 8 minutes late. Had no chance. :v:

I'm wondering if I should just buy a 2070 or something now, rather than wait until the 3070 launch - the way this is going, it wouldn't surprise me if after the 3070 launch all 20x0 cards will get sold out

I've seen that Currys has both the Gigabyte Aorus Master and Vision dropping on the 5th of next month; maybe wait till then for nothing to be available.

MonkeyLibFront fucked around with this message at 18:46 on Oct 17, 2020

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

shrike82 posted:

people should realize that Nvidia marketing something as a tensor core exclusive feature doesn't mean that the hardware is actually a prereq.
and for paul's benefit, GEMM (the relevant tensor matrix multiply operation) is hardware accelerated on all video cards (including on AMD cards) - what the tensor cores do is speed up lower precision (fp16, bfloat16, and i believe tf32 with Ampere) GEMM. there's nothing specific to tensor cores that makes DLSS inferencing exclusive to it. funnily, nvidia themselves have a (single frame) AI upscaling solution running 4K60 on their Shield TVs (with a slow Tegra X1+ that doesn't have tensor cores).

it's not clear the frame + movement vectors and previous frame CNN autoencoder approach they've gone with is the only/best way to do game upscaling - just look at how fast Nvidia is iterating on their implementations. given how fast the broader ML field moves, it would not surprise me to see Microsoft or a game developer roll out an implementation of a better approach that some random academic group (or even Microsoft Research) publishes out of nowhere.

and most importantly, it's something Microsoft is keen to implement with their new consoles


i meant to post this chart earlier and got off doing something else, so thanks for posting. i asked my so about the relative approaches this morning. not a gfx person, but an ml prof. he said this idea isn’t surprising as people refine techniques and realize they can accelerate more fundamental maths, and they can do that by just reusing pre-existing shader cores. assuming microsoft worked with them to implement this in hardware, i wonder if the ps5 has it, or if their desktops will have it / the appropriate variant.

my guess is that whatever version of hardware dlss amd comes up with, it’ll probably be behind a bit at launch and the internet will wail and gnash teeth and buy nvidia, and then a few driver releases in or whatever it starts to improve.

i guess maybe one question is whether amd wants to put in the same kind of model training time on it per title or come at it from a more universal, if less individually tweaked, set of rules.

mediaphage fucked around with this message at 17:07 on Oct 18, 2020

Rubellavator
Aug 16, 2007

just checking back in and wow guys this thread is looking a little tensor than when i left

Stickman
Feb 1, 2004

shrike82 posted:

it's not clear the frame + movement vectors and previous frame CNN autoencoder approach they've gone with is the only/best way to do game upscaling - just look at how fast Nvidia is iterating on their implementations. given how fast the broader ML field moves, it would not surprise me to see Microsoft or a game developer roll out an implementation of a better approach that some random academic group (or even Microsoft Research) publishes out of nowhere.

You're probably right here, but there are very real limits to what you can do with frame only. You have limited information for upscaling which means you need to make very strong assumptions if you're trying to get close to native quality. DLSS 1.0 tried to use ML to optimize those assumptions on a per-game basis and it performed comparatively poorly for the huge amount of work it required. There might be some other information that can be brought into the upscaling algorithm, but I suspect that any good solution in the future is going to incorporate both the frame plus some amount of extra scene information.

Stickman fucked around with this message at 19:27 on Oct 17, 2020

Xachariah
Jul 26, 2004

jaete posted:

Yeah I noticed Curry's as well, a day or two ago. There was a message on one of the Discords saying Curry's have some 3080s up but also sorry that was actually like 5 min ago we don't have a bot for it yet so didn't notify you all sooner.

I was also running the stupid JavaScript bot in the background, and it was checking Curry's, but it somehow missed it at first as well, it alerted me some 3 minutes after I saw the Discord message. So in total I was maybe 8 minutes late. Had no chance. :v:

I'm wondering if I should just buy a 2070 or something now, rather than wait until the 3070 launch - the way this is going, it wouldn't surprise me if after the 3070 launch all 20x0 cards will get sold out

poo poo man, sounds like you actually saw the second wave there. The Ventus and Trio X went up initially, went out of stock, then they were back into stock 5 mins later briefly.

I've heard of a similar kind of outcome on Amazon UK. There's the initial stock and then a staccato restock shortly after. I think these places scan in stock one by one and some people have been successful just continuing to squat on the page and F5 after they "miss" the initial drop.

Theophany
Jul 22, 2014

SUCCHIAMI IL MIO CAZZO DA DIETRO, RANA RAGAZZO



2022 FIA Formula 1 WDC
One month since paying for a card as yet unreceived.

This feels like a bad consumer experience. Like, it's not the end of the world, but it's real bad.

loving demand issues :argh:

Pivo
Aug 20, 2004


repiv posted:

the usefulness of an AI upscaler is predicated on how fast it is, and nvidia has an edge there with their specialized hardware units

AMD not only needs to catch up with the DLSS implementation but also surpass its performance to keep the runtime in the ~1ms range on their hardware

Dedicated inference hardware is a solved problem though, isn't it? Just spend some transistors on matrix multiply-add, which is all a tensor core is. Apple, Qualcomm, Intel, NVIDIA's figured it out ...

Animal
Apr 8, 2003

Paul MaudDib posted:

why are you booing me? I’m right.

Right as in correct, or right as in “microsoft flight simulator 2020 is just a glorified tech demo”?

steckles
Jan 14, 2006

shrike82 posted:

it's not clear the frame + movement vectors and previous frame CNN autoencoder approach they've gone with is the only/best way to do game upscaling - just look at how fast Nvidia is iterating on their implementations. given how fast the broader ML field moves, it would not surprise me to see Microsoft or a game developer roll out an implementation of a better approach that some random academic group (or even Microsoft Research) publishes out of nowhere.
I'm reminded of the introduction of Morphological Antialiasing during the PS3/360 era. A new technique requiring the awesome power of the Cell processor comes along, things seem dire for other platforms as the method doesn't work well without SPUs, and the image quality gain is too large to ignore. I remember being blown away seeing God of War III for the first time. It was a bigger leap in graphical fidelity at the time than going from 1440p to 4K is today in my opinion. After a while, cheaper algorithms came along that maybe didn't look quite as good, but they worked everywhere and so became the default everywhere.

I've got some experience with neural networks and machine learning algorithms. Enough to implement a digit classifier with MNIST from scratch in C++ anyway. I'm also familiar with reconstruction algorithms for computer graphics as writing ray tracers has been a dorky hobby of mine for the last 20 years. The work NVIDIA's been doing on neural denoising is extremely impressive and could be a total game changer for stuff like the architectural visualisation field.

DLSS is also extremely impressive but I don't think it's self evident that they've hit the best spot on the quality/computation curve or that their algorithm is universally applicable. Nobody outside of NVIDIA knows exactly what they're doing, but part of their algorithm we do know about is an auto-encoder. Autoencoders are, put very simply, networks that accept a bunch of inputs, contain a "choke point" in the middle that forces the inputs to be mapped to a much smaller representation, and have an expander that attempts to reconstruct the original inputs from the compressed middle layers. These are great at denoising, inpainting missing details, or, if you've got a larger number of outputs than inputs, scaling. Running them is just a bunch of matrix multiplications, which is what the Tensor Cores are designed to do quickly. They usually produce crap results when the inputs are too far outside their training set though, and you can't train on everything.
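
To put a picture on that, here's a toy sketch in PyTorch (my own illustration - made-up layer sizes, nowhere near what NVIDIA actually ships): squeeze a frame through a bottleneck, then let the decoder overshoot by one extra step so the reconstruction comes back at twice the input resolution.

code:
import torch
import torch.nn as nn

class ToyUpscalingAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # encoder: squeeze the 3-channel frame down to a small feature map (the "choke point")
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # decoder: expand back out, with one extra 2x step so the output is
        # twice the resolution of the input
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

frame = torch.randn(1, 3, 540, 960)            # stand-in for a 960x540 frame
print(ToyUpscalingAutoencoder()(frame).shape)  # torch.Size([1, 3, 1080, 1920])

A real DLSS-style network is also fed motion vectors and previous frames rather than a single image, but it all boils down to the same matrix math either way.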

To engage in a bit of semi-informed baseless speculation, I think part of the reason DLSS still isn't widely available and has shown up in so few shipped titles is that it requires a degree of per-title or even per-scene tuning in its current state to produce decent results. Not retraining the underlying network, but massaging and filtering its inputs. It's in UE4 now, but NVIDIA does approval on a per-developer basis before they let you turn it on. Throwing the doors open and letting everybody play with it will mean that its limits and weak points are quickly found. NVIDIA has determined, correctly in my opinion, that a few titles that implement DLSS really well combined with the promise of more to come is going to sell more cards than those same few games and a bunch of awful indie crap that uses DLSS badly.

I guess what I'm getting at is that DLSS doesn't seem, to me, to be a general solution to the problem of game upscaling and there are almost certainly cheaper, non neural network based algorithms that will do well enough. Now that they've opened that door, I think we're going to see a lot of rapid progress in the field, and I can't wait to see where things are in a couple years.

Pivo
Aug 20, 2004


Zedsdeadbaby posted:

Resolution is quantifiable, 'higher quality pixels' are not. Go 'not a good point' at me again with more microsoft pr buzzspeak, paul mauddib needs the competition for hot takes

You're a moron. Higher quality pixels literally does have meaning. It means quality > quantity in terms of the content of the raster vs. the size of the raster. It has meaning in displays, it has meaning in sensors, it has meaning in rendering. I can't believe this has to be explained. It can be quantified in various ways - shading precision, mesh complexity, raytracing bounce depth, whatever.

mediaphage posted:

i guess maybe one question is whether amd wants to put in the same kind of model training time on it per title or come at it from a more universal, if less individually tweaked, set of rules.

DLSS 2.0 is a general solution and doesn't require per-title training anymore, actually!

Pivo fucked around with this message at 21:16 on Oct 17, 2020

Cygni
Nov 12, 2005

raring to post

Nvidia release the cards, the thread is barely holding together.

Fair Hallion
Jul 25, 2007

:toot: :toot: :toot: :toot:
Is this the right place to ask for help... if not (sorry) let me know where best to move it to.


I need some advice with GPU & power requirements, I'm clueless on power so go ahead and call me an idiot if need be. (I'm in UK, if it matters)

My teenage son, who has a not-particularly-impressive PC, has gone and blown his savings on a new GPU without properly checking if he can use it.


This is the card: Gigabyte GeForce RTX 2060 OC https://www.amazon.co.uk/gp/product/B07MJGCPW5

it has an 8-pin power connection; his current GPU only uses power from the PCI slot.

His 500w PSU doesn't have this 8pin power connector; it has two SATA and two molex, but the HDD is using one molex.

Do we have any option here but to swap out to a new PSU which has this 8 pin connector?

I've heard of adapters, eg which connect two molex to this 8pin, but then what does his HDD use? Can I use those two SATA power connectors for anything here?

As I say, I'm clueless on voltages / wattage / power draw etc.

Also I've heard you should just avoid adapters, unless you like occasional fires.

Any advice much appreciated!

other specs if they matter:
MSI MS-7788 H61M-P31 (G3) LGA 1155 M-ATX Motherboard
Intel Core i7-2600 3.4 GHz Quad-Core Processor
16gb DDR3 RAM
1TB HDD

spunkshui
Oct 5, 2011



Fair Hallion posted:

Is this the right place to ask for help... if not (sorry) let me know where best to move it to.


I need some advice with GPU & power requirements, I'm clueless on power so go ahead and call me an idiot if need be. (I'm in UK, if it matters)

My teenage son, who has a not-particularly-impressive PC, has gone and blown his savings on a new GPU without properly checking if he can use it.


This is the card: Gigabyte GeForce RTX 2060 OC https://www.amazon.co.uk/gp/product/B07MJGCPW5

it has an 8-pin power connection; his current GPU only uses power from the PCI slot.

His 500w PSU doesn't have this 8pin power connector; it has two SATA and two molex, but the HDD is using one molex.

Do we have any option here but to swap out to a new PSU which has this 8 pin connector?

I've heard of adapters, eg which connect two molex to this 8pin, but then what does his HDD use? Can I use those two SATA power connectors for anything here?

As I say, I'm clueless on voltages / wattage / power draw etc.

Also I've heard you should just avoid adapters, unless you like occasional fires.

Any advice much appreciated!

other specs if they matter:
MSI MS-7788 H61M-P31 (G3) LGA 1155 M-ATX Motherboard
Intel Core i7-2600 3.4 GHz Quad-Core Processor
16gb DDR3 RAM
1TB HDD

You can convert a single sata plug to an 8pin. I think 500 might work.

i7 2600 is a CPU from 2011; depending on the game it’s going to hold the GPU back.

It will absolutely run games much faster than it currently does, at least.

If that machine doesn’t have an SSD for its OS, it’s a good investment to make. It can turn into a game storage drive in another computer down the road.

spunkshui fucked around with this message at 22:00 on Oct 17, 2020

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Fair Hallion posted:

Unfortunate situation

Is the PSU a model that we can look up? Does it have any modular cables or are all the cables just coming out of one big loom?

You could adapt one of the SATA power cables into molex for the HDD, which would then let you adapt the 2 x molex from the PSU into a PCI-E 8-pin.

Personally, I would never do this because I don't trust those adapters and have budget for a suitable PSU, but it would probably work.

VorpalFish
Mar 22, 2007
reasonably awesome™

Fair Hallion posted:

Is this the right place to ask for help... if not (sorry) let me know where best to move it to.


I need some advice with GPU & power requirements, I'm clueless on power so go ahead and call me an idiot if need be. (I'm in UK, if it matters)

My teenage son, who has a not-particularly-impressive PC, has gone and blown his savings on a new GPU without properly checking if he can use it.


This is the card: Gigabyte GeForce RTX 2060 OC https://www.amazon.co.uk/gp/product/B07MJGCPW5

it has an 8-pin power connection; his current GPU only uses power from the PCI slot.

His 500w PSU doesn't have this 8pin power connector; it has two SATA and two molex, but the HDD is using one molex.

Do we have any option here but to swap out to a new PSU which has this 8 pin connector?

I've heard of adapters, eg which connect two molex to this 8pin, but then what does his HDD use? Can I use those two SATA power connectors for anything here?

As I say, I'm clueless on voltages / wattage / power draw etc.

Also I've heard you should just avoid adapters, unless you like occasional fires.

Any advice much appreciated!

other specs if they matter:
MSI MS-7788 H61M-P31 (G3) LGA 1155 M-ATX Motherboard
Intel Core i7-2600 3.4 GHz Quad-Core Processor
16gb DDR3 RAM
1TB HDD

A 500w psu should be fine to run a 2060, but I'd be very suspect of the quality of a 500w unit that doesn't have a pcie connector.

I might consider getting a new one, especially if it's as old as that CPU - psus don't last forever.
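
For a rough sense of the numbers (ballpark TDP / board power figures I'm assuming, not measurements of that particular build):

code:
# back-of-envelope only - assumed ballpark figures, not measurements
parts_watts = {
    "RTX 2060 (board power)": 160,
    "i7-2600 (TDP)": 95,
    "motherboard / RAM / HDD / fans": 50,
}
print(f"estimated peak draw: ~{sum(parts_watts.values())} W")  # ~305 W

So a decent 500w unit has plenty of headroom - the worry with that one is quality, not capacity.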

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Kraftwerk posted:

Well everyone knew there’s no way I’m waiting till Sunday for this.
I’m free at last! I didn’t even need to F5 for this.


I’m very happy about the ending to the Kraftwerk 3080 Saga.

v1ld
Apr 16, 2012

steckles posted:

To engage in a bit of semi-informed baseless speculation, I think part of the reason DLSS still isn't widely available and has shown up in so few shipped titles is that it requires a degree of per-title or even per-scene tuning in its current state to produce decent results. Not retraining the underlying network, but massaging and filtering its inputs. It's in UE4 now, but NVIDIA does approval on a per-developer basis before they let you turn it on. Throwing the doors open and letting everybody play with it will mean that its limits and weak points are quickly found. NVIDIA has determined, correctly in my opinion, that a few titles that implement DLSS really well combined with the promise of more to come is going to sell more cards than those same few games and a bunch of awful indie crap that uses DLSS badly.

So having the extra data available from a good TAA implementation may not be sufficient to have a good DLSS implementation in the same game? Interesting if so and may explain the gap between promise and reality right now. Thanks for the post, good read along with the one you responded to.

When it comes to why DLSS adoption is slow, there's also the fact that current games have perfectly good presentation without DLSS because they were designed to have good presentation without DLSS, RTX aside. Studios may simply not be willing to put in extra work on a game that doesn't require the DLSS perf boost when they've already put in a bunch of work to have reasonable performance at current resolutions.

Games that are designed from the ground up to use DLSS-like reconstructive/scale-up boosts, so they can go further on quality per pixel (as so well put earlier), will undoubtedly show up once the tech shakes out some more - which will certainly happen when the consoles drop and their approaches are known, regardless of what happens on PCs.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

AirRaid posted:

"Tensor" is literally a trademark name that nvidia came up with for a thing, it is not a type of thing in itself.

Whoever told you this is not your friend.

Fair Hallion
Jul 25, 2007

:toot: :toot: :toot: :toot:

Riflen posted:

Is the PSU a model that we can look up? Does it have any modular cables or are all the cables just coming out of one big loom?

this seems to be it - pretty basic I admit. All wires come out of the same hole in the PSU's casing
https://www.ebay.co.uk/itm/ATX-500-B-500W-PC-Power-Supply-Black-With-24-Pin-3-x-SATA-12CM-Silent-Fan/192162044644


Riflen posted:

You could adapt one of the SATA power cables into molex for the HDD, which would then let you adapt the 2 x molex from the PSU into a PCI-E 8-pin.

would the SATA provide the same power to the HDD as the molex? Is it just the connector that's different? (SATA wires look thinner than the molex ones)

thanks

Fair Hallion
Jul 25, 2007

:toot: :toot: :toot: :toot:

VorpalFish posted:

A 500w psu should be fine to run a 2060, but I'd very suspect of the quality of a 500w unit that doesn't have a pcie connector.

I might consider getting a new one, especially if it's as old as that CPU - psus don't last forever.

Thanks, the PC isn't that old though, built from new parts 18 months ago or so, but just not latest or top of the range parts. So it's not an ancient PSU, just on the basic side.

Geemer
Nov 4, 2010



Animal posted:

Right as in correct, or right as in “microsoft flight simulator 2020 is just a glorified tech demo”?

"Not left."

Stickman
Feb 1, 2004

Fair Hallion posted:

this seems to be it - pretty basic I admit. All wires come out of the same hole in the PSU's casing
https://www.ebay.co.uk/itm/ATX-500-B-500W-PC-Power-Supply-Black-With-24-Pin-3-x-SATA-12CM-Silent-Fan/192162044644

Several of the reviews indicate that people are having problems trying to power discrete cards with that supply. That's the very, very bottom of the barrel, and I'd be hesitant to trust it with anything outside a cheap office computer :(

Unfortunately, it's a pretty terrible time to be buying PSUs. Corsair CX/CXM are usually about the cheapest PSUs I'd recommend for a gaming PC, and even those are pretty expensive right now. It looks like you could get a better quality Corsair TXM for cheaper than the CX right now, though. It's still spendy, but it's a good investment to protect your components, and it comes with a 7-year warranty so he could easily carry it forward for quite a while.

repiv
Aug 13, 2009

Pivo posted:

Dedicated inference hardware is a solved problem though, isn't it? Just spend some transistors on matrix multiply-add, which is all a tensor core is. Apple, Qualcomm, Intel, NVIDIA's figured it out ...

sure they can get there eventually if they need to, we're mostly talking about RDNA2 for the next gen consoles and desktop cards though

if it turns out they can't make a viable DLSS clone without that kind of acceleration in their consumer cards it will set them back 2 or 4 years

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Fair Hallion posted:

this seems to be it - pretty basic I admit. All wires come out of the same hole in the PSU's casing
https://www.ebay.co.uk/itm/ATX-500-B-500W-PC-Power-Supply-Black-With-24-Pin-3-x-SATA-12CM-Silent-Fan/192162044644


would the SATA provide the same power to the HDD as the molex, is it just the connector that's different? (SATA wires look thinner that the molex ones)

thanks

SATA is designed for Hard Disks and Optical drives. It'll be fine.

I'm kind of surprised the HDD doesn't have a SATA power connector. Does it have a SATA data connector and a Molex power connector? I haven't seen one of those for 15 years.


Fair Hallion
Jul 25, 2007

:toot: :toot: :toot: :toot:

Stickman posted:

Several of the reviews indicate that people are having problems trying to power discrete cards with that supply. That's the very, very bottom of the barrel, and I'd be hesitant to trust it with anything outside a cheap office computer :(

Unfortunately, it's a pretty terrible time to be buying PSUs. Corsair CX/CXM are usually about the cheapest PSUs I'd recommend for a gaming PC, and even those are pretty expensive right now. It looks like you could get a better quality Corsair TXM for cheaper than the CX right now, though. It's still spendy, but it's a good investment to protect your components, and it comes with a 7-year warranty so he could easily carry it forward for quite a while.

Fair enough. I think a new PSU is the route we'll have to take. Thanks for the help!
