Fuzz
Jun 2, 2003

Avatar brought to you by the TG Sanity fund

K8.0 posted:

GPUs for the past 3 and a half years since the 1080Ti was released.

Still the 3080 is pretty loving impressive, even though yeah, I suspect this mid-gen refresh will be a bigger step up than the Turing Supers were.


Yeah, this is an odd situation where we are going from the PS4/Bone, which were arguably the weakest console generation ever released relative to their contemporary PCs, with the average gamer already having a faster PC when the drat things were new, to a generation that is actually going to be a good twice as powerful as the average current gaming PC in terms of GPU, CPU, memory, and storage. I do think it will take longer than you expect, because I expect at least the next year to be almost entirely PS4/Bone ports, but after that you are going to need a proper modern PC for a good number of games, which hasn't been the case for like 15 years.

Yeah, this is why I'm thinking by 2022, this new gen will be hitting its stride with multi-process games with RT that are streaming assets directly from the SSD to the VRAM and rendering your open world on the fly. At that point basically 90% of the PCs currently on the market are going to poo poo and run like garbage or not at all.
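
A minimal sketch of that streaming idea: a background reader keeps a bounded queue of asset chunks fed to a GPU-upload step so the renderer never waits on the disk. This is a conceptual illustration only; the dummy asset file and the upload placeholder are made up, and it is not DirectStorage or any real engine API.

```python
import os
import queue
import tempfile
import threading

CHUNK = 256 * 1024  # arbitrary example request size: 256 KiB


def reader(paths, out_q):
    """Background thread: stream chunks off disk into the queue."""
    for path in paths:
        with open(path, "rb") as f:
            while block := f.read(CHUNK):
                out_q.put(block)
    out_q.put(None)  # sentinel: nothing left to stream


def uploader(in_q, upload):
    """Consume chunks; in a real engine this would copy into a VRAM staging buffer."""
    while (block := in_q.get()) is not None:
        upload(block)


if __name__ == "__main__":
    # fake 4 MiB "asset" so the sketch is self-contained
    tmp = tempfile.NamedTemporaryFile(delete=False)
    tmp.write(os.urandom(4 * 1024 * 1024))
    tmp.close()

    q = queue.Queue(maxsize=64)  # bounded, so the reader can't outrun the uploader
    threading.Thread(target=reader, args=([tmp.name], q), daemon=True).start()
    uploader(q, lambda block: None)  # placeholder for the actual GPU copy
    os.unlink(tmp.name)
```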

wolrah
May 8, 2006
what?
Welp, my 2060 stand-in plan fell apart when I went to install it and saw that it only had three video outputs.

I wasn't expecting it to match the five my 970 has, but I figured it'd have three DP + one HDMI.

Unfortunately I have three monitors plus a VR headset and I'm not really interested in swapping cables around every time I want to play VR.

Back to Microcenter I go tomorrow I guess....

shrike82
Jun 11, 2005

heh, just get a Stadia account to play CP2077

MarcusSA
Sep 23, 2007

shrike82 posted:

heh, just get a Stadia account to play CP2077

Actually GFN.

hobbesmaster
Jan 28, 2008

repiv posted:

i wish apex had DLSS, i don't need more performance but their TAA sucks rear end

for some reason it takes forever to converge even at >150fps

Honestly it’s kind of amazing Apex works, period, since it’s on the Source engine.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I only really expect something DLSS like to take off when Microsoft drops their own implementation based on DirectML. Because then you're not tied to an NVidia developer contract.
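
For a sense of what a vendor-neutral, DirectML-backed path could look like in practice, here is a hedged sketch using ONNX Runtime's DirectML execution provider, which runs on any DX12-class GPU. The model file "upscaler.onnx" and the tensor name "lowres_frame" are hypothetical stand-ins, not anything Microsoft has shipped.

```python
import numpy as np
import onnxruntime as ort  # pip install onnxruntime-directml

# Hypothetical upscaling model exported to ONNX; the file and I/O names are made up.
session = ort.InferenceSession(
    "upscaler.onnx",
    providers=["DmlExecutionProvider"],  # DirectML backend, vendor-agnostic
)

lowres = np.random.rand(1, 3, 540, 960).astype(np.float32)  # 960x540 RGB frame
(highres,) = session.run(None, {"lowres_frame": lowres})
print(highres.shape)
```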

Jeff Fatwood
Jun 17, 2013

Taima posted:

This "incentivize devs not to use DLSS" thing is pathetic and AMD should feel bad.

Or maybe, and hear me out here... just maybe Nvidia should open up some of their "revolutionary" tech once in a while if they actually want it to start getting implemented.

Kazinsal
Dec 13, 2011



Combat Pretzel posted:

I only really expect something DLSS like to take off when Microsoft drops their own implementation based on DirectML. Because then you're not tied to an NVidia developer contract.

Yeah, at some point Microsoft is going to just go, "gently caress it, here's the secret sauce" and everyone will have access to it.

Right now the way it's described, DLSS 2.0 is some masterpiece of Nvidia training every single individual game that supports it on petabytes of frame data and somehow condensing that down into ML instructions that can be loaded onto a single GPU's tensor cores. Now, I may be a CPU optimization and software optimization type, but that's pretty far into the "baffling accomplishments" realm.
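
For a sense of scale: however much frame data goes into training, what ships with the driver is just the learned weights, which are tiny by comparison. Here is a toy 2x upscaler in PyTorch, purely illustrative and nothing like Nvidia's actual network, to show what "fits on a single GPU's tensor cores" looks like:

```python
import torch
import torch.nn as nn

# Toy 2x super-resolution net: low-res RGB frame in, 2x-larger RGB frame out.
toy_upscaler = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 sub-pixels per input pixel
    nn.PixelShuffle(2),                  # rearrange channels into a 2x-larger image
)

params = sum(p.numel() for p in toy_upscaler.parameters())
print(f"{params:,} parameters, ~{params * 2 / 1024:.0f} KiB at FP16")

lowres = torch.rand(1, 3, 540, 960)   # 960x540 input frame
print(toy_upscaler(lowres).shape)     # torch.Size([1, 3, 1080, 1920])
```

A production model is far larger than this toy, but the same principle holds: the training frames never ship, only the weights do.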

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Jeff Fatwood posted:

Or maybe, and hear me out here... just maybe Nvidia should open up some of their "revolutionary" tech once in a while if they actually want it to start getting implemented.

I’m kinda with you in theory, but DLSS is a multi-generational, extremely expensive and risky power play that not only took years to pull off, it was also a pretty big black eye on Nvidia for most of Turing until it got good very recently.

It’s also one of the only (the only?) vendor-specific add-in technologies that has ever been truly revolutionary.

They can’t just open-source something like that. It’s not realistic. Possibly they could license it, but that gets thorny, especially when AMD is literally just paying people not to use it.

DLSS doesn’t have to be in every game, it only needs to be in as many AAA games as they can reasonably onboard, and it’s only going to get easier to implement. There haven’t even been that many new AAA titles on PC since DLSS 2.0 was released.

I may eat crow on this one but if CP2077 is good, it should be a total breakout moment for DLSS2. This is the first real debut of the tech in a brand new year-defining title. Reviewers are going to have to say “hey so if you don’t have DLSS you are leaving like 50% performance on the table sorry” like... gently caress ray tracing in comparison. that’s the real prize.

poo poo, it’s barely even an add-in tech a la Hairworks, it’s the future of gaming period, except only one vendor has it for now.

Nvidia has the money and connections to get DLSS in a lot of the games that matter, and they surely will. I wish we could all sing kumbaya and open source everything ever but that’s just not in the cards. The best AMD can hope to do is shoehorn in some kind of checkerboard rendering that sucks in comparison but still produces a large gain in framerate, so at least then they’re not getting blown out every time a new DLSS benchmark releases.

Hopefully down the line when there is a competitor to DLSS we can open source the whole thing a la Gsync compatible, put it into every game and forget about it.

In the meantime if Nvidia can get DLSS into, say, 5-10 AAA titles per year, the tech is so good that it’s more than justified itself.

Xachariah
Jul 26, 2004

For any UK guys here, don't sleep on Curry's. Just got an email confirming my order was dispatched for a Ventus which was listed on there for £680.00.

I expected it to be a bamboozle or get cancelled due to being a glitch considering the price is the lowest I've seen the RTX 3080 go for in the EU apart from the £650 FE. I was also only looking at the more expensive TUF listings at the time since there was a rumour that Curry's was getting some of those in stock sometime in the last week.

shrike82
Jun 11, 2005

Microsoft mentioned hardware-accelerated ML inferencing for resolution scaling in their Xbox hot chip talk.
I wouldn't be surprised if they develop a console-facing DLSS that starts out inferior but overtakes Nvidia's implementation just because it's made accessible for Xbox game development.

Carecat
Apr 27, 2004

Buglord
Uh, aren't all these RTX discussions missing the point? The consoles are going to be driving raytracing, with Microsoft/Sony eager to get developers to add a new feature and publishers eager for new franchise gimmicks.

fuf
Sep 12, 2004

haha

Xachariah posted:

For any UK guys here, don't sleep on Curry's. Just got an email confirming my order was dispatched for a Ventus which was listed on there for £680.00.

I expected it to be a bamboozle or get cancelled due to being a glitch considering the price is the lowest I've seen the RTX 3080 go for in the EU apart from the £650 FE. I was also only looking at the more expensive TUF listings at the time since there was a rumour that Curry's was getting some of those in stock sometime in the last week.

When did you order it?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

Microsoft mentioned hardware-accelerated ML inferencing for resolution scaling in their Xbox hot chip talk.
I wouldn't be surprised if they develop a console-facing DLSS that starts out inferior but overtakes Nvidia's implementation just because it's made accessible for Xbox game development.

Without hardware acceleration they will probably end up with a DLSS 1.9-esque implementation running on a simplified model with worse quality. To be fair, without the intention to move to a tensor-based solution, maybe they will be able to come up with a DLSS 1.9-style approach with less of a quality hit relative to the full model; NVIDIA may not have explored that space as thoroughly as they might have if they didn’t know it was an intermediate step that was going to be quickly discarded in favor of hardware acceleration. And neural net pruning is certainly an area of active research, but there are limits to that as well.

That is unfortunately going to be something that reviewers are going to have to come to grips with - there is nothing stopping competitors from running at a lower internal resolution or on a simplified model with lower visual quality and saying “aha, we’re faster!”. Reviewers are going to have to assess the quality of the output to a much greater extent than they have for the last 15 years or so (since the days of the bit-depth reduction scandals). Up until Turing came out, pixels were more or less pixels, and nobody reviewed with upscaling turned on.

Paul MaudDib fucked around with this message at 10:20 on Oct 17, 2020
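
For the pruning point above, a minimal sketch of magnitude pruning using PyTorch's built-in utilities; the layer here is a throwaway example, not anything from DLSS:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Conv2d(32, 32, kernel_size=3, padding=1)

# Zero out the 70% of weights with the smallest magnitude, then bake the mask in.
prune.l1_unstructured(layer, name="weight", amount=0.7)
prune.remove(layer, "weight")

kept = (layer.weight != 0).float().mean().item()
print(f"{kept:.0%} of weights remain non-zero")  # ~30%
```

The tradeoff described above is exactly the quality hit: fewer weights means less compute per frame, but also a coarser approximation of the full model.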

shrike82
Jun 11, 2005

dude stop saying tensor based solution when you clearly don't know what a tensor is.
and they literally said "hardware accelerated ML inferencing" in their talk

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Kazinsal posted:

Right now the way it's described, DLSS 2.0 is some masterpiece of Nvidia training every single individual game that supports it on petabytes of frame data and somehow condensing that down into ML instructions that can be loaded onto a single GPU's tensor cores. Now, I may be a CPU optimization and software optimization type, but that's pretty far into the "baffling accomplishments" realm.

Neural networks and machine learning are pretty baffling stuff.

Pivo
Aug 20, 2004


I don't think it's reasonable to assume that kind of AI research can only happen at NVIDIA though. If AMD or the broader industry wanted their own temporally-stable up-scaling deep learning model targeting hardware accelerated inference, I'm sure they could do it -- NVIDIA's advantage is in the head start, IMO.

And how long in real terms will DLSS remain an advantage? Historically resolution has steadily increased over time, but with 4K we're at the point of severe diminishing returns going further. In a few generations of PC hardware when 4K native is like 1440p native is today, will DLSS still be such a huge advantage?

It is pretty cool, but it feels like a stopgap.

Xachariah
Jul 26, 2004

fuf posted:

When did you order it?

Ordered it yesterday at 1.15pm. Was only up for a minute or so.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Kraftwerk posted:

Well everyone knew there’s no way I’m waiting till Sunday for this.
I’m free at last! I didn’t even need to F5 for this.


fantastic implementation of hairworks!

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Pivo posted:


And how long in real terms will DLSS remain an advantage? Historically resolution has steadily increased over time, but with 4K we're at the point of severe diminishing returns going further. In a few generations of PC hardware when 4K native is like 1440p native is today, will DLSS still be such a huge advantage?

It is pretty cool, but it feels like a stopgap.

That's not a very good point; resolution increases will always be a goal no matter what. I've been hearing people say 'we don't need higher resolutions' since the days of 320x240 Quake gaming.

Pivo
Aug 20, 2004


Zedsdeadbaby posted:

That's not a very good point; resolution increases will always be a goal no matter what. I've been hearing people say 'we don't need higher resolutions' since the days of 320x240 Quake gaming.

No, that's not a very good point.

Pixel density at normal viewing distances for 4K is high enough that additional pixels are wasted. Think camera sensors - we used to always want more megapixels when we barely had any, but now that a 60MP full-frame sensor outresolves most lenses, we start seeking higher quality pixels instead. Same with displays ... the pixel density is sufficiently high now that we are chasing high-framerate, low persistence, higher contrast ratio, higher peak brightness -- better pixels -- rather than more of them.

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.

wolrah posted:

Welp, my 2060 stand-in plan fell apart when I went to install it and saw that it only had three video outputs.

I wasn't expecting it to match the five my 970 has, but I figured it'd have three DP + one HDMI.

I'm in a similar situation trying to find a replacement for my 4-port 970. The most suitable option is a Gigabyte 2060 non-Super on sale for 309€, but I wouldn't get Step-Up. An EVGA 1660 could be suitable, but it's lacking ports. EVGA 2060s are too expensive from the Finnish dealer and I'm not sure how it would work if I bought one from Amazon.de. I assume EVGA would allow that, but would the Step-Up price be based on Finnish or German pricing? And the biggest question is the 3060 release time; I wouldn't want to purchase too early and be forced to step up to a 3070. We don't even know how much EVGA 3070s will cost in Finland, the cheapest 3080 is 830€.

At least I just found a configuration to make the integrated Intel graphics work with my current card, even if it's a bit janky, so that's plan B. Plan C is to keep my 970 as a secondary card, and that might just help with the multi-monitor power saving.

repiv
Aug 13, 2009

Pivo posted:

I don't think it's reasonable to assume that kind of AI research can only happen at NVIDIA though. If AMD or the broader industry wanted their own temporally-stable up-scaling deep learning model targeting hardware accelerated inference, I'm sure they could do it -- NVIDIA's advantage is in the head start, IMO.

the usefulness of an AI upscaler is predicated on how fast it is, and nvidia has an edge there with their specialized hardware units

AMD not only needs to catch up with the DLSS implementation but also surpass its performance to keep the runtime in the ~1ms range on their hardware
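
To put that ~1ms figure in perspective, the fraction of each frame a fixed-cost upscaling pass eats grows quickly with the target framerate:

```python
# Share of the frame budget consumed by a fixed 1 ms upscaling pass.
for fps in (60, 120, 144, 240):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps: {frame_ms:5.2f} ms/frame -> 1 ms = {1 / frame_ms:.0%} of the budget")
```

So an implementation that needs 2-3 ms instead of ~1 ms gives back a big chunk of the framerate it was supposed to buy, especially at high refresh rates.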

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

dude stop saying tensor based solution when you clearly don't know what a tensor is.
and they literally said "hardware accelerated ML inferencing" in their talk

if you don’t understand what tensor units do and how they accelerate DLSS there’s not much point to you opining about how a DLSS substitute might play out.

RDNA2 not having hardware-accelerated tensor ops will critically limit its performance on “big” nets like the ones DLSS 2.0 uses; the best it will be able to do is a smaller pruned net like DLSS 1.9 that produces a lower-quality approximation.

As an ostensible ML researcher I would have thought you’d understand what tensor units do in processing ML nets.

Paul MaudDib fucked around with this message at 13:40 on Oct 17, 2020
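
For readers following along, the operation those units accelerate is a fused matrix multiply-accumulate on small tiles, typically FP16 inputs with FP32 accumulation. In plain NumPy terms (tile size chosen purely for illustration):

```python
import numpy as np

# D = A @ B + C on a small tile: FP16 inputs, FP32 accumulate.
A = np.random.rand(4, 4).astype(np.float16)
B = np.random.rand(4, 4).astype(np.float16)
C = np.zeros((4, 4), dtype=np.float32)

D = A.astype(np.float32) @ B.astype(np.float32) + C  # mixed-precision MMA
print(D.dtype, D.shape)  # float32 (4, 4)
```

Dedicated units do many of these tile operations per clock; without them the same math falls back on the general-purpose shader ALUs, which is the performance gap being argued about here.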

shrike82
Jun 11, 2005

lol dude just google what a tensor is

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Tensor cores, tenser dorks.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
"Tensor" is literally a trademark name that nvidia came up with for a thing, it is not a type of thing in itself.

You might as well be saying "well if AMD doesn't have CUDA cores then their cards will be useless".

shrike82
Jun 11, 2005

tensors are math objects...
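
For anyone lost in the naming slapfight: in the ML-library sense a tensor is just an n-dimensional array, and Nvidia's "Tensor Core" is a marketing name for hardware that multiplies them quickly.

```python
import numpy as np

scalar = np.array(3.0)                  # rank 0
vector = np.array([1.0, 2.0, 3.0])      # rank 1
matrix = np.eye(3)                      # rank 2
frames = np.zeros((8, 3, 1080, 1920))   # rank 4: a batch of RGB frames
for t in (scalar, vector, matrix, frames):
    print(t.ndim, t.shape)
```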

Truga
May 4, 2014
Lipstick Apathy
just stop replying to paul lol, he's an even bigger case of dunning kruger than me

SCheeseman
Apr 23, 2003

Things are getting tensor

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness
Amazon sent me an email update telling me that they'll send me an email when they have a delivery date for me, cool!

Truga
May 4, 2014
Lipstick Apathy

Pivo posted:

No, that's not a very good point.

Pixel density at normal viewing distances for 4K is high enough that additional pixels are wasted. Think camera sensors - we used to always want more megapixels when we barely had any, but now that a 60MP full-frame sensor outresolves most lenses, we start seeking higher quality pixels instead. Same with displays ... the pixel density is sufficiently high now that we are chasing high-framerate, low persistence, higher contrast ratio, higher peak brightness -- better pixels -- rather than more of them.

this is only true to a point. as refresh rates and especially contrast improve, so does the need for resolution. resolution of an average eye at normal monitor distances is between 200 and 300 dpi (30" 8k is at the top range of that at 290dpi for example), but at high contrasts you can still spot much smaller anomalies than that (say, white pixel on a black screen) super easily.

there's a paper floating somewhere on the internet about VR research, and the ideal minimum we'd want for VR to feel 100% natural is ~1000hz, and that requires batshit insane resolutions to avoid artifacts when moving your head, and antialias tricks can only do so much.

e: a super easy way of checking for this is how non-antialiased printed text only starts to look good above ~800 dpi, because a black on white print under sunlight will have 10 thousand times higher contrast than your average screen.

Incessant Excess posted:

Amazon sent me an email update telling me that they'll send me an email when they have a delivery date for me, cool!

as someone waiting this out, this is the best drat launch ever lmao

Truga fucked around with this message at 13:56 on Oct 17, 2020
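
Truga's "30 inch 8k is about 290dpi" figure checks out, assuming a 16:9 panel and measuring pixels per inch along the diagonal:

```python
import math

def ppi(h_px, v_px, diag_in):
    """Pixels per inch along the diagonal of a display."""
    return math.hypot(h_px, v_px) / diag_in

for name, spec in {'27" 4K': (3840, 2160, 27), '30" 8K': (7680, 4320, 30)}.items():
    print(f"{name}: {ppi(*spec):.0f} ppi")  # ~163 and ~294
```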

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

AirRaid posted:

"Tensor" is literally a trademark name that nvidia came up with for a thing, it is not a type of thing in itself.

You might as well be saying "well if AMD doesn't have CUDA cores then their cards will be useless".

“tensor” refers to the matrices that TensorFlow works on and is not an NVIDIA specific term, and you’ll note that I didn’t say “AMD needs CUDA cores”, I said they needed hardware acceleration for DLSS, they need their own implementation of tensor units, whatever they end up calling them. Otherwise performance just isn’t going to be good enough for a “big” net like DLSS 2.0, and AMD will probably be limited to a pruned “approximation” net that produces a lower-quality output like DLSS 1.9 uses.

shrike is being smarmy as usual and doesn’t understand what tensor units even do in the whole DLSS process and why not having hardware acceleration for tensor ops is probably a limiting factor for AMD, and you’re being a pedant about something I didn’t even say about naming.

Here, take another read since you didn’t catch it the first time.

Paul MaudDib posted:

Without hardware acceleration they will probably end up with a DLSS 1.9-esque implementation running on a simplified model with worse quality. To be fair, without the intention to move to a tensor-based solution, maybe they will be able to come up with a DLSS 1.9-style approach with less of a quality hit relative to the full model; NVIDIA may not have explored that space as thoroughly as they might have if they didn’t know it was an intermediate step that was going to be quickly discarded in favor of hardware acceleration. And neural net pruning is certainly an area of active research, but there are limits to that as well.

Anyway, NVIDIA didn’t come up with the term “tensor” in the first place; it refers to the matrices used in ML. The predominant library in the area is called “TensorFlow”, and that was named by Google, not NVIDIA.

Paul MaudDib fucked around with this message at 14:11 on Oct 17, 2020

shrike82
Jun 11, 2005

lol dude why do you need to pretend to know ML to win an argument

ijyt
Apr 10, 2012

How many tensor cores does my bot need to get me a 3080 FE in the UK.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Truga posted:

just stop replying to paul lol

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
why are you booing me? I’m right.

Rolo
Nov 16, 2005

Hmm, what have we here?
So this shitshow was the first time I tried to get popular tech on release. Is it going to be about the same trying to get a Zen 3? I’m on a Ryzen 7 3700X so I’m not in a rush, but some friends are going to be trying.

Sagebrush
Feb 26, 2012

Two people who both managed to download and compile TensorFlow slapping the poo poo out of each other over which one is the true genius AI expert :allears:

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer

Kraftwerk posted:

Well everyone knew there’s no way I’m waiting till Sunday for this.
I’m free at last! I didn’t even need to F5 for this.


Nvidia GameReady Dog™
