Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I get it man, you're dead inside. I'm legitimately sorry for that. I wish you the best.

shrike82
Jun 11, 2005

It’s the first time I’ve heard someone say that being the son of an engineer is the reason why they’re excited about a graphics card launch, but more power to you man.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Taima posted:

The craziest thing to me (besides the fact that DLSS2 is better than native, which still amazes me) is that Nvidia's ambitious vision for gaming GPUs is coming true all at once.

Nvidia basically sacrificed Turing to make a gigantic leap ahead. It seemed like lunacy at the time, but goddamn if it doesn't look like 4D chess in 2020.

The final unknown piece of the Ampere puzzle is RTX. If the rumors are true, and RTX is vastly more efficient in Ampere, that constitutes the completion of their grand scheme. The stage is set for Ampere to be something really special. And DLSS will help ensure that Ampere is viable long past the normal shelf lives of high-end cards as well.

Besides the RTX question, it will also be interesting to see which SKUs are actually made on TSMC 7nm. That's just icing on the cake, but let's hope at least the 3080Ti/3080 make it to market on that fab.

I sound like such a fanboy and maybe I am at this point, but I have never been so excited for a GPU launch. I've been through them all. I was lucky enough to be raised in Silicon Valley, and my father was an engineer, so I always had the hotness from the beginning (through no merit of my own). So like many of you I've been around the block a few times with GPU launches. That being said, it seems like so many loose ends are coming together at the same time in a way that is going to produce a spectacular product. I can't wait for the 31st.

It's also looking like Ampere/RDNA 2 may be a short generation, with chiplet-based architectures (Hopper and RDNA3 respectively) coming next year, which is yet another quantum leap. Like, in 18 months you might be able to get close to double the performance per dollar and double the absolute performance you can get right now (between shrinks, DLSS, and chiplets). This is a really interesting little blip in the Moore's law curve; things have been stagnating a lot as shrinks have died a slow death, but finally they might be breaking loose.

(yeah let's get that "wait for next gen" started early! :regd08:)

Paul MaudDib fucked around with this message at 06:01 on Aug 12, 2020

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I think the list looks like this, but I might be missing a 6xx entry and the order is possibly ahistorical.

Matrox Millennium
Matrox G200
GeForce 256
(Whatever they called the chip in the SGI Indy)
GeForce 3
Radeon 9700
Radeon 9800 Pro
780Ti
970
2x970
2x980Ti
1080

And yeah, my interest in Ampere is piqued like I haven’t felt since we first saw programmable shaders, really.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

shrike82 posted:

It’s the first time I’ve heard someone say that being the son of an engineer is the reason why they’re excited about a graphics card launch, but more power to you man.

My easily understood point was that I've seen all the GPU launches because I was born into those circumstances (it would have been impossible otherwise, I'm too young), so being uniquely stoked for this one after seeing all of the others is a bigger deal, in my mind at least.

Paul MaudDib posted:

It's also looking like Ampere/RDNA 2 may be a short generation, with chiplet-based architectures (Hopper and RDNA3 respectively) coming next year (yeah let's get that "wait for next gen" started early! :regd08:), which is yet another quantum leap. Like, in 18 months you might be able to get close to double the performance per dollar and double the absolute performance you can get right now (between shrinks, DLSS, and chiplets). This is a really interesting little blip in the Moore's law curve; things have been stagnating a lot as shrinks have died a slow death, but finally they might be breaking loose.

Completely agreed man. Exciting times!!

MikeC
Jul 19, 2004
BITCH ASS NARC

Taima posted:

The craziest thing to me (besides the fact that DLSS2 is better than native, which still amazes me)

From what I have read and seen from screenshots, while you gain greater detail than native in some cases, the process also introduces artifacts that rendering at native resolution doesn't. The fact that you can get acceptable image quality while upscaling, with large FPS gains when compared side by side with native, is amazing. I agree there. But to say it is strictly better than native... that is a stretch.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
That is a good point, I shouldn't phrase it like it's 100% perfect when it's not.

In practice though, in the DLSS2 games I've played so far personally, it's effectively perfect to my eyes. That is completely subjective, however. Regardless, you're completely right.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I thought “native resolution rendering” was mostly a polite fiction these days with the various scaling of render targets to hit frame budgets and whatnot.
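(for anyone curious what that scaling actually looks like: it's basically a tiny feedback loop on the render-target scale. A toy Python sketch with made-up thresholds, not any engine's real controller:)

```python
def next_scale(scale, frame_ms, budget_ms=16.7,
               step=0.05, lo=0.5, hi=1.0):
    """Nudge the render-target scale toward the frame budget."""
    if frame_ms > budget_ms:           # over budget: render fewer pixels
        scale -= step
    elif frame_ms < budget_ms * 0.85:  # plenty of headroom: claw quality back
        scale += step
    return max(lo, min(hi, scale))     # clamp to sane bounds

# dropping frames at full res -> scale down a notch
print(next_scale(1.0, 20.0))
```

so "native" is whatever the controller happened to settle on that frame.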

MiniSune
Sep 16, 2003

Smart like Dodo!
Envy is a CGA owner looking at his friends EGA PC.

Meanwhile the Amiga owners laughed at us both and lorded over us with their flashy graphics and awesome sound.

But we showed them in the end.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

god, Ampere is going to be able to slowly bounce so many mirrored balls over a checkerboard plane...

Craptacular!
Jul 9, 2001

Fuck the DH
Lemme see...

Voodoo (Monster 3D)
TNT2
GeForce 256 DDR
GF4 Ti4400
Radeon 9700 PRO
6800 GT
7600 GT
Some cheap Radeons because I wrote PC gaming off for dead between 2007-2012, one was passively cooled and that was very neat
GTX460
GTX660
RX470 4GB
GTX1070 and still waiting

TOOT BOOT
May 25, 2010

I don't even remember what card I had 5 years ago, let alone 15 or 20.

eggyolk
Nov 8, 2007


A not insignificant portion of my brain is still bound by an intense desire to own a Geforce 6800 Ultra and it kind of makes me sad.

eggyolk
Nov 8, 2007


Look at these benchmarks though, holy poo poo.

Shrimp or Shrimps
Feb 14, 2012


eggyolk posted:

A not insignificant portion of my brain is still bound by an intense desire to own a Geforce 6800 Ultra and it kind of makes me sad.

Weren't there some 6800GS's or some such that could be unlocked to an ultra? Or am I misremembering? For some reason I think it must be a Gainward card.

Craptacular!
Jul 9, 2001

Fuck the DH

eggyolk posted:

A not insignificant portion of my brain is still bound by an intense desire to own a Geforce 6800 Ultra and it kind of makes me sad.

The Ultra was only very slightly better than the GT and cost like $550 compared to the GT's $450.

edit: But yeah, my 6800 GT was so good that I bought it in 04 and sold it in 06 not because the games were too much for it, but because it was impossible to get AGP on a motherboard that wasn't a few ASRocks that weren't worth a drat.

Cygni
Nov 12, 2005

raring to post

Subjunctive posted:

I think the list looks like this, but I might be missing a 6xx entry and the order is possibly ahistorical.

Matrox Millennium
Matrox G200
GeForce 256
(Whatever they called the chip in the SGI Indy)
GeForce 3
Radeon 9700
Radeon 9800 Pro
780Ti
970
2x970
2x980Ti
1080

I think this is the most consistently baller lineup I’ve seen posted so far, although I’m really diggin seeing these parts I’ve totally forgotten about like the X800GTO

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
Man, I've never bought a top-tier video card. I think the closest I ever got was the Geforce 4 Ti4400, back when the lineup was 4200/4400/4600. I've just doggedly refused to spend $300 or more on a video card, ever since high school when I thought my friend was nuts for dropping like $350 on a Geforce 3.


I think the 1060 is probably the last card I'm going to get away with that on though. Fuckin' video cards.

CaptainSarcastic
Jul 6, 2013



Farmer Crack-rear end posted:

Man, I've never bought a top-tier video card. I think the closest I ever got was the Geforce 4 Ti4400, back when the lineup was 4200/4400/4600. I've just doggedly refused to spend $300 or more on a video card, ever since high school when I thought my friend was nuts for dropping like $350 on a Geforce 3.


I think the 1060 is probably the last card I'm going to get away with that on though. Fuckin' video cards.

That was true for me. Going to a 27" 1440p monitor finally made me drop $500 for a GPU that can drive it properly.

shrike82
Jun 11, 2005

Reviewing Ampere will be logistically interesting - tech sites are going to have to compare rasterization, RT, and DLSS when there aren't many games out that support the latter two. Not to mention the launch seems likely to happen before CP2077 is out.

I'm also surprised at the dearth of A100 reviews/benchmarks - there's been nothing since that unofficial Octabench tweet.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

Subjunctive posted:

I thought “native resolution rendering” was mostly a polite fiction these days with the various scaling of render targets to hit frame budgets and whatnot.

True, but it hasn't stopped pro-AMD youtubers from dismissing DLSS as a software cheat. It'll be pretty telling to see which reviewers do or don't include DLSS in their reviews in a few months' time.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin
Here we go:
    Diamond Stealth 64
    Matrox Mystique
    Orchid Righteous 3D (3DFX Voodoo 1)
    3DFX Voodoo II 12MB x 2
    Riva TNT2 Ultra
    Guillemot 3D Prophet II Geforce 2 GTS
    Geforce FX 5600
    (6-year dark age)
    EVGA Geforce GT 220
    EVGA Geforce GTX 570 1.25GB x 2
    EVGA Geforce GTX 780 Ti x 2
    EVGA Geforce GTX TITAN X (Maxwell) x 2
    EVGA Geforce GTX 1080 Ti
    Nvidia Geforce RTX 2080 Ti FE

Anime Schoolgirl
Nov 28, 2002

Anime Schoolgirl posted:

While we're at it we might as well post our video card/display adapter history:

1994-1998: a number of S3 display adapters
1999: Voodoo 2
2001-2003: Geforce 2 MX/Geforce 4 MX
2004-2005: Radeon 9800 Pro
2006-2007: Radeon X1650 Pro
2008: Geforce 8800GT
2009-2010: GTX 275 triple kill :suicide:
2010-2011: Radeon HD 5770
2012: Radeon HD 6670 + Llano dual graphics :laffo:
2013-2015: Radeon HD 7870
2014-2016: Geforce 750 Ti (secondary PC)
2016: Radeon R9 290 reference hair dryer edition (killed by coffee)
2016: Geforce GTX 1070
2020: Geforce RTX 2070 Super

FuturePastNow
May 19, 2014


redreader posted:

When was it actually true that Macs had better graphics than PC's? Because as soon as graphics cards started being made like the voodoo and Riva TNT and whatever was before that, it must have stopped being true. But someone repeated that to me again in maybe 2007.

Macs had the same graphics cards as PCs. The iMac had a Rage 128 (or a Rage XL? Rage something). Power Mac G4s mostly had GeForce2 MX / GeForce4 MX cards, with some faster options available; the G5s had cards ranging from the FX 5200 to the Radeon 9600 to Quadro cards. After that you get into the Intel Macs, which still use the same GPUs as any other computer, ranging from Intel IGPs to modern Radeons. The boards are all custom but the actual GPU silicon is the same.

These cards had different firmwares so they could speak to the OpenFirmware that PowerPC Macs used instead of BIOS and the EFI used by the Intel Macs, but of course cards can be flashed one way or the other.

So yeah, any graphical differences came down to software.

Shrimp or Shrimps posted:

Weren't there some 6800GS's or some such that could be unlocked to an ultra? Or am I misremembering? For some reason I think it must be a Gainward card.

Yep. I had a 6800GS that I unlocked into an Ultra. I think it was an XFX card. This could be done with all the AGP cards that used the same NV40 chip as the Ultra; they were cut down by just turning off four of the pipes, in a way that was easily un-done with RivaTuner. Nvidia learned from this mistake.

This couldn't be done with the PCIe 6800GS, those used the NV41 and had no hidden units to enable.

FuturePastNow fucked around with this message at 09:43 on Aug 12, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I haven't actually read anyone's GPU lists here

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

ConanTheLibrarian posted:

True, but it hasn't stopped pro-AMD youtubers from dismissing DLSS as a software cheat.

Oh wow. Such denial. Seriously, a "cheat"? It's clearly a well implemented optimisation, and it's not forced upon the user.
I've probably purchased more AMD cards historically than NVIDIA ones due to price/performance, but it means nothing. Why do people shill for a particular brand? It's not like they get anything out of it

Truga
May 4, 2014
Lipstick Apathy

Nfcknblvbl posted:

I had buddies who used the Glide wrapper to play Quake 2 in that mode. I preferred D3D mode myself since Glide washed out the textures a lot, and gave it that Nintendo 64 smeared look.

Edit: D3D != software.

i played q2 on software for a very long time because GL featured a low max draw distance, which made it impossible to see/snipe people across some big maps. someone hacked that eventually but for a long time it was really annoying trying to play with acceleration lol

Craptacular!
Jul 9, 2001

Fuck the DH

HalloKitty posted:

Oh wow. Such denial. Seriously, a "cheat"? It's clearly a well implemented optimisation, and it's not forced upon the user.
I've probably purchased more AMD cards historically than NVIDIA ones due to price/performance, but it means nothing. Why do people shill for a particular brand? It's not like they get anything out of it

Even if there are two options in graphics cards, if one is total Rajatrash and sells zero units then there is no competition, since one option is just a waste of everyone’s time. So if these bloggers exist, they usually fiercely promote AMD because their own card history is a long line of GeForce cards in the past few years, and they want to stay on team green without losing all their money.

People talk about the lack of competition, but things have been pretty good for the past six months or so. For the vast majority of the past twenty years in GPUs we have seen “The Card Everybody Should Buy” and “The Card You Wouldn’t Recommend To Anyone”. Which company makes which has flopped back and forth a few times over the years, but every dud that makes it off a lab floor will eventually be shilled for by someone who wouldn’t buy it.

repiv
Aug 13, 2009

DLSS is cheating, but it's cheating in exactly the same way that native resolution with TAA is cheating. Both are taking an undersampled signal and reconstructing it up to an approximation of the ground truth, the only difference is how effectively they do it.

Purists are welcome to supersample their games 8x or more to hit the sweet, sweet Nyquist rate without any cheats, and with low single-digit frames per second
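(the undersampling point is literally signal-processing 101 - a toy Python sketch, nothing to do with any actual renderer, but it shows why samples taken below the Nyquist rate can't tell two frequencies apart:)

```python
import math

def sample(freq_hz, rate_hz, n):
    """Point-sample sin(2*pi*f*t) at a fixed sample rate."""
    return [math.sin(2 * math.pi * freq_hz * k / rate_hz) for k in range(n)]

# A 7 Hz tone sampled at 10 Hz (its Nyquist rate would be 14 Hz) produces
# exactly the same samples as an inverted 3 Hz tone -- classic aliasing:
under = sample(7, 10, 40)
alias = [-s for s in sample(3, 10, 40)]
assert all(abs(a - b) < 1e-9 for a, b in zip(under, alias))
```

no reconstruction filter can recover information the samples never captured - it can only guess, which is what TAA and DLSS both do.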

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

ConanTheLibrarian posted:

True, but it hasn't stopped pro-AMD youtubers from dismissing DLSS as a software cheat. It'll be pretty telling to see which reviewers do or don't include DLSS in their reviews in a few months' time.

HalloKitty posted:

Oh wow. Such denial. Seriously, a "cheat"? It's clearly a well implemented optimisation, and it's not forced upon the user.
I've probably purchased more AMD cards historically than NVIDIA ones due to price/performance, but it means nothing. Why do people shill for a particular brand? It's not like they get anything out of it

Guess the argument is that right now only a handful of games support it. Cyberpunk will make it harder to sideline.

Warmachine
Jan 30, 2012



Zedsdeadbaby posted:

I haven't actually read anyone's GPU lists here

I've generally skimmed them, but I can't actually remember what I had besides my current 970 (in ITX form factor :shobon:).

Except that I know I swing back and forth between ATI/AMD and Nvidia like a pendulum. If I was going to keep to that, I'd be holding out for RDNA 2, but gently caress that. I'm fiending to get a new card to finish my build and drat it hurry up. :shepspends:

shrike82
Jun 11, 2005

If you want to hit that nostalgia, DF has a retro series of videos benchmarking old games with era appropriate hardware using contemporary frame measurement tools -

https://www.youtube.com/watch?v=PALxS8O4HHI
(Oblivion on a Q6600/8800GT)

Enos Cabell
Nov 3, 2004


I picked up a 2070s last week, haven't decided yet if I'm going to keep it for my living room vr setup or step-up for my main rig, but I can definitely say that dlss 2.0 is worthy of the hype. Been playing Control this week w/ RTX and all settings on high at 1440p, dlss at 720p and seem to be averaging around 65-75fps. Native rendering was 35-45fps. Really wish it had an in game benchmark.
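(for scale: a 720p internal render for 1440p output is a quarter of the pixels, which lines up with those roughly-doubled framerates. Quick arithmetic:)

```python
def pixels(w, h):
    return w * h

# 2560x1440 output vs. 1280x720 internal render
ratio = pixels(2560, 1440) / pixels(1280, 720)
print(ratio)  # 4.0
```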

Mercrom
Jul 17, 2009

MikeC posted:

From what I have read and seen from screenshots, while you gain greater detail than native in some cases, the process also introduces artifacts that rendering at native resolution doesn't. The fact that you can get acceptable image quality while upscaling, with large FPS gains when compared side by side with native, is amazing. I agree there. But to say it is strictly better than native... that is a stretch.

No artifact compares to aliasing, and native still has to run AA to not look like poo poo. TAA is the best we currently have, but it makes things blurry in motion, which seems way worse than the artifacts from DLSS. The videos I've seen from Digital Foundry make it seem like DLSS actually is strictly superior to native resolution.

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.
Onboard S3 Trio32/64
Matrox Mystique
3Dfx Voodoo Graphics
3Dfx Voodoo3
PowerVR Kyro II
Geforce3
GeForce FX 5900XT
GeForce 7800GS
GeForce 9600GT
GeForce GTX 550 Ti
GeForce GTX 660
Radeon RX 480
GeForce GTX 1070 Ti
Radeon Vega Frontier Edition

This does not include all the other systems I put together because I was bored along the way... if you can name a graphics card make and model from the last twenty-five years, I’ve probably at least played with it.

Truga
May 4, 2014
Lipstick Apathy

repiv posted:

DLSS is cheating, but it's cheating in exactly the same way that native resolution with TAA is cheating. Both are taking an undersampled signal and reconstructing it up to an approximation of the ground truth, the only difference is how effectively they do it.

Purists are welcome to supersample their games 8x or more to hit the sweet, sweet Nyquist rate without any cheats, and with low single-digit frames per second

from what i can see DLSS is way better than TAA tbh, because TAA looks like absolute garbage compared to even no loving AA no matter what i try, but videos of DLSS... mostly don't.

e: like, i'll take no AA jaggies over TAA blur any loving day but DLSS look ok enough.

repiv
Aug 13, 2009

YMMV because TAA doesn't refer to any specific algorithm, just a general concept with dozens of subtly different implementations, but DLSS looks better than at least some TAAs yeah

We don't have a great sample size to compare DLSS to a wide variety of TAA flavours yet

repiv fucked around with this message at 14:59 on Aug 12, 2020

ufarn
May 30, 2009
A big issue with TAA (and FXAA, and both at the same time) is how rarely it gets combined with a sharpening filter. TAA without sharpening is an objectively awful vaseline layer. Also, a lot of companies seem to just implement it in horrible ways, not that I know what the capacity for screwing up TAA implementations is.
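(post-TAA sharpening is usually just an unsharp mask - add back a scaled copy of whatever the blur removed. A toy 1-D Python version, not any engine's actual filter:)

```python
def box_blur(px, radius=1):
    """Cheap 1-D blur: average each pixel with its neighbours."""
    out = []
    for i in range(len(px)):
        window = px[max(0, i - radius):i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def unsharp_mask(px, amount=1.0):
    """sharpened = original + amount * (original - blurred)"""
    return [p + amount * (p - b) for p, b in zip(px, box_blur(px))]

edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
sharp = unsharp_mask(edge)
# the edge overshoots on both sides -- the classic sharpening "halo":
assert sharp[2] < 0.0 and sharp[3] > 1.0
```

crank `amount` too high and you get exactly the crunchy over-sharpened look described below.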

Truga
May 4, 2014
Lipstick Apathy
i played a game with friends for a couple months this spring that had both a really powerful sharpening filter that couldn't be disabled because it'd just get enabled again on next loading screen even when turned off in config files, and a TAA that made everything blurry.

the end result was that playing without TAA looked pretty bad, and playing with TAA looked worse

i tried dicking around with nvidia sharpening and poo poo, but nothing helped.

the final solution was to drop to potato mode settings rather than the "remastered" option that makes it actually look decent, because that didn't have the sharpening filter or TAA options for some reason? lmao

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Intel is launching enthusiast/midrange GPUs with raytracing in 2021, based on an "external foundry".

https://i.imgur.com/7drHiqr.gifv
