SCheeseman
Apr 23, 2003

Lockback posted:

Someone is still bitter they didn't have a Voodoo Rush and convinced themselves of SOME LIES

Can your Voodoo Rush run the game at 1024x768? :smug:

feedmegin
Jul 30, 2008

FBS posted:

What happened 21 years ago? Did they even have graphics cards in 1999?

...do you think we were still using green screen text terminals like it was 1980? :shobon:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

feedmegin posted:

...do you think we were still using green screen text terminals like it was 1980? :shobon:

'99 didn't just have GPUs, it even had Mac GPUs! It was my first GPU ever, an ATI Rage, just in time for Unreal Tournament.

That was like the first and last time a Mac shipped with a competitive GPU

Chimp_On_Stilts
Aug 31, 2004
Holy Hell.

SCheeseman posted:

Mech 2 looked best with software rendering IMO. The low resolution, repeating textures slapped onto massive landscapes didn't do the game a lot of good.

I refuse to accept you dishonoring the memory of the first game I ever played with a 3D accelerator and would challenge you under the ritual of Zellbrigen if it wasn't such a pain in the rear end to get MW2 working on netplay.

My rose-colored memories recall the game being loving gorgeous.

Rap Game Goku
Apr 2, 2008

Word to your moms, I came to drop spirit bombs


Zero VGS posted:

'99 didn't just have GPUs, it even had Mac GPUs! It was my first GPU ever, an ATI Rage, just in time for Unreal Tournament.

That was like the first and last time a Mac shipped with a competitive GPU

I bought my Blue & White G3 in that two-week window when the Rage 128 was king. Been downhill ever since.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
When was it actually true that Macs had better graphics than PCs? Because as soon as graphics cards like the Voodoo and Riva TNT (and whatever came before them) started being made, it must have stopped being true. But someone repeated that to me again in maybe 2007.

LRADIKAL
Jun 10, 2001

Fun Shoe
It has more to do with software availability and optimization and integration of said software with the available hardware. i.e. if all the best producers with the most money buy macs and use a particular piece of mac software, and the best hardware add-ons are mac compatible, then you end up with the best production tool chain in spite of potentially "worse" hardware.

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

Maybe never? The software was better for graphics, both at the system level and across the ecosystem.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

redreader posted:

When was it actually true that Macs had better graphics than PCs? Because as soon as graphics cards like the Voodoo and Riva TNT (and whatever came before them) started being made, it must have stopped being true. But someone repeated that to me again in maybe 2007.



It was true for like 1 release of the iMac, but it legit was a good gaming PC. Back then 1 year made a huge difference so it didn't stay relevant for super long.

Ugly In The Morning
Jul 1, 2010
Pillbug

Lockback posted:



It was true for like 1 release of the iMac, but it legit was a good gaming PC. Back then 1 year made a huge difference so it didn't stay relevant for super long.

In the 90’s I remember a computer that was 2 years old being positively useless for new games; that was insane. I bought my own computer for the first time in 2001 and was able to at least squeeze four years out of it with GPU and RAM upgrades.

Ugh, one of those upgrades was the 5200FX. What a crap card. I think I only had that for a year.

akadajet
Sep 14, 2003

I just remembered this:


lol

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Ugly In The Morning posted:

In the 90’s I remember a computer that was 2 years old being positively useless for new games; that was insane. I bought my own computer for the first time in 2001 and was able to at least squeeze four years out of it with GPU and RAM upgrades.

Ugh, one of those upgrades was the 5200FX. What a crap card. I think I only had that for a year.

Yeah, in like the 97-2000 range you'd spend $2000 in 90s bucks on a PC and it'd be obsolete for games within 18 months. It was brutal. You could play games in software mode though and a bunch of people deluded themselves into thinking it was the same thing/better *ahem*

Somewhere around the Geforce2 time frame or something things got better. Even the TNT2 held its own for a while. But yeah it was pretty nuts.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Ah yea, the bad old days of budget GPUs literally being absolutely loving useless. Like you'd go out and buy a Radeon 9200SE or an FX5200 and it could not run any game from the preceding two years properly unless you put it at like 320x240 low settings. There was nothing to justify their existence; if you tried to do a budget spec you would just go to bed pissed off, your money gone up in smoke. I'm glad we've moved on from those times. Now if you buy something like a 1650 or even a 1050ti you know it's low-end, but it's still going to run games decently well at 720p-1080p high.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Geforce 256, didn't that have some problems or was underwhelming or something? After I typed Geforce2 I was trying to remember why I thought that card was so, so much better than the 256. Maybe the Voodoo3 was just a better value against the 256 or something?

Ugly In The Morning
Jul 1, 2010
Pillbug

Zedsdeadbaby posted:

Ah yea, the bad old days of budget GPUs literally being absolutely loving useless. Like you'd go out and buy a Radeon 9200SE or an FX5200 and it could not run any game from the preceding two years properly unless you put it at like 320x240 low settings. There was nothing to justify their existence; if you tried to do a budget spec you would just go to bed pissed off, your money gone up in smoke. I'm glad we've moved on from those times. Now if you buy something like a 1650 or even a 1050ti you know it's low-end, but it's still going to run games decently well at 720p-1080p high.

As far as the FX5200 goes, that whole line was a shitshow. I was able to get KOTOR running acceptably, somehow, but I got rid of that card as soon as I could and replaced it with the Radeon 9800 that I basically turned into a 9800 pro with BIOS shenanigans. If I hadn’t replaced the whole computer I probably could have kept that card going for ages.

Cygni
Nov 12, 2005

raring to post

It was the first card with a hardware transform and lighting (T&L) engine, so it took a while for enough games to support it to make a difference, and the D3D drivers were absolute garbo when it first launched. People who had already purchased a TNT2 Ultra for the eye-watering price of $300 were miffed that Nvidia offered a whole new architecture so soon after that didn't do enough in D3D. Also, everyone widely mocked Nvidia's attempt to rebrand graphics cards as "GPUs" at the time, lol.

Also it was released right on the transition from SDR to DDR when DDR was super expensive. By the time the Geforce 2 launched 4 months later (ohhhh these were the days), DDR prices had fallen enough to make it more mainstream and the bandwidth jump was huge.
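
Back-of-envelope, that bandwidth jump is easy to put numbers on. Bus width and clocks here are from memory, so treat them as approximate:

# rough memory bandwidth of the two Geforce 256 variants (Python)
bus_bytes = 128 // 8                 # 128-bit bus -> 16 bytes per transfer
sdr = 166e6 * bus_bytes              # SDR model, ~166 MHz: ~2.7 GB/s
ddr = 150e6 * 2 * bus_bytes          # DDR model, ~150 MHz double-pumped: ~4.8 GB/s
print(f"SDR {sdr/1e9:.1f} GB/s vs DDR {ddr/1e9:.1f} GB/s")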

I had a Geforce 256 SDR from Hercules(!) that i kept for a long long time. Was a great card that aged exceptionally well by modern standards.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Yeah, that seems right. I think it was a cool card that just didn't have a ton of value at the time, but the Geforce2 was a lot better. I may be remembering people happy they waited and THEN making fun of the scrubs who got the 256 vs hating on the card immediately.

Carecat
Apr 27, 2004

Buglord

BIG HEADLINE posted:

What worries me is that we'll get a release with SKUs that have 10-12GB of frame buffer, and then the "Super" refreshes will double it.

Then again, if they put out a 24GB 3080Ti for $1500, I might bite.

Surely not, GDDR is pretty expensive right?

Edit: I found one place claiming it costs $22 for 8GB, but it's more like $10 a GB from Micron, which sounds believable.

Carecat fucked around with this message at 22:09 on Aug 11, 2020

Arzachel
May 12, 2012
I was stuck on a Prescott + FX5200 system for seven years. Some games had very low presets that disabled shaders so I could get a playable double digit frame rate!

Ugly In The Morning
Jul 1, 2010
Pillbug
I can’t believe the FX5200 didn’t even have a fan in a lot of configurations, just a big ol’ finned heatsink.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Carecat posted:

Surely not, GDDR is pretty expensive right?

Edit: I find one place claiming it costs $22 for 8GB but it's more like $10 a GB from Micron which sounds believable.

Be careful whether you're looking at GB vs Gb. Most of the pricing tables I've seen are in Gb, or 1/8 of a GB. ~$20/8Gb would be expensive, but not impossible for GDDR6. $22/8GB is cheaper than GDDR5 pricing.
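
The mixup is easy to sanity-check; here's the arithmetic with the thread's figures (the prices are just the numbers quoted above, not anything official from Micron):

# 8 Gb (gigabits) = 1 GB (gigabyte), so misreading the unit is an 8x error
price = 22.0
as_per_8Gb = 8 * price    # $22 per 8 Gb -> $176 of memory on an 8 GB card (pricey)
as_per_8GB = price        # $22 per 8 GB -> suspiciously cheaper than GDDR5 ever got
print(as_per_8Gb, as_per_8GB)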

DrDork fucked around with this message at 22:33 on Aug 11, 2020

shrike82
Jun 11, 2005

I grew up with lovely video cards - Trident, Cirrus Logic cards and the S3 Virge. The odd thing is there’s a gap in my memory between the Virge and the first card I bought with my own money when I started working (660 Ti). I built several PCs for gaming in college (2004-2008) but don’t remember the parts I used - anyone remember what was cheap and mainstream then?

I should probably dig through my email to see if I can find order receipts.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

shrike82 posted:

I grew up with lovely video cards - Trident, Cirrus Logic cards and the S3 Virge. The odd thing is there’s a gap in my memory between the Virge and the first card I bought with my own money when I started working (660 Ti). I built several PCs for gaming in college (2004-2008) but don’t remember the parts I used - anyone remember what was cheap and mainstream then?

Radeon 9700 wasn’t super cheap, except by today’s standards, but it’s the card I most remember from that rough window.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
My mother's work gave away their old work PCs, so I had some old Win 98 PC that I played Tie Fighter and X-Wing Alliance on. No idea of its specs, but it couldn't handle a burned copy of Quake 2 my brother's friend gave to him. Poor thing booted the game and died.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
The 9700 was a 2002-era card, but was definitely a great buy. Probably the first GPU that was still relevant years after its release. The Geforce 7800 GT came out in 2005, and IIRC Nvidia was pretty dominant through the 7000 and 8000 series until AMD's TeraScale cards got competitive with the 4000 series in 2008.

Cygni
Nov 12, 2005

raring to post

shrike82 posted:

I grew up with lovely video cards - Trident, Cirrus Logic cards and the S3 Virge. The odd thing is there’s a gap in my memory between the Virge and the first card I bought with my own money when I started working (660 Ti). I built several PCs for gaming in college (2004-2008) but don’t remember the parts I used - anyone remember what was cheap and mainstream then?

I should probably dig through my email to see if I can find order receipts.

I did that recently and had forgotten a few cards. This is my primary computer only, cause at some point i started getting... more... and more computers.

Paradise ISA
S3 Virge DX
Voodoo Rush
Voodoo Banshee
i740+Voodoo 2
Geforce 256 SDR
Radeon 8500 LE
Radeon 9600 Pro
Geforce 6600 GT
Radeon 4850
Radeon 7770 Ghz Edition
Geforce GTX 960
Geforce GTX 1060 6Gb
Geforce RTX 2080

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

I missed out on the entire 3DFx series, something that bothered me quite a bit. Used to fantasize about having a Voodoo 2 or Riva TNT2.

I started with a 2MB ATI Rage II built into the mobo.
I jumped from that to a P4 with the Geforce 2 MX400.
Then:
ATI Radeon X800
ATI Radeon 5770XT
GTX 1070

My next card is definitely gonna be a 3080Ti, which I'm preordering with EVGA if that's even possible.

Ugly In The Morning
Jul 1, 2010
Pillbug
As far as cards in computers I personally owned goes:

NVidia Vanta (lol)
NVidia FX5200 (lmao)
ATi Radeon 9800 “Pro” (now we’re talking)

2X NVidia 7800 GTX in SLI
---
GeForce 9600 GT
Radeon HD5750

Whatever garbage is in an Alienware Alpha

1660 Ti in my laptop and 2070 Super in my desktop.

Indiana_Krom
Jun 18, 2007
Net Slacker
My first real 3D accelerator was a Voodoo2 8MB, followed by:
Voodoo3 3000
Geforce 3
5900 Ultra
6800 GT AGP with ultra bios
7950 GT
8800 GT
680
980
currently on a 1080

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"
Voodoo 2
GeForce MX440
Radeon 9600XT (first time I ever bought a GPU just for one game, HL2)
Radeon 4890
GeForce 560
GeForce 970
GeForce 1080Ti

Next up is a 3080Ti just for Cyberpunk 2077.

I swear there was a card between the 9600XT and the 4890 but I'm drawing a blank right now. The first 2 cards were in family computers that I upgraded under the careful guidance of my parents, who knew nothing about computers; the 9600XT was in my first computer build as a teen.

Cavauro
Jan 9, 2008

After slumming it for a long time before buying an 8800GT, i've been buying a '70' Nvidia card or equivalent, skipping one generation, then buying the next '70' Nvidia card. Let's all take care of each other and our gpus.

Ugly In The Morning
Jul 1, 2010
Pillbug
Normally I would skip a generation but even with DLSS, RT is demanding enough that I’m probably bumping up to a 30 series and maybe even doing the 80 instead of the 70. I haven’t whaled out on graphics like that in 15 years.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

LRADIKAL posted:

It has more to do with software availability and optimization and integration of said software with the available hardware. i.e. if all the best producers with the most money buy macs and use a particular piece of mac software, and the best hardware add-ons are mac compatible, then you end up with the best production tool chain in spite of potentially "worse" hardware.

This is spot on.

The "Macs are better for graphics" era definitely existed, but it was in the late 80s to the mid 90s, and it was never really a gaming or performance thing. PC graphics hardware was a mess of incompatible standards with weird performance and feature gaps, and the limited Apple hardware set made for a comparatively easy and stable target. Photoshop and PageMaker (now InDesign) started as Mac-only products, and even after the Windows ports came out, Mac users were first-class citizens lording it over the Windows folks for quite a while. Mac OS's font handling was also way better than anything you'd get on DOS or Windows for a long, long time - not a big deal for most users, but essential for anybody trying to lay a page out to exact picas and points. If you were serious about any kind of print work, a Mac was absolutely necessary.

By the time consumer 3D cards started to become commonplace in the late 1990s and early 2000s, though, Apple's hardware was nothing special, and the pro graphics situation on Windows made it to close-enough feature parity. Windows' own font handling stayed garbage for a long time, but any applications for people who cared included their own rendering engine and solved the problems themselves. At that point Apple was mostly coasting on the momentum of designers and others who'd learned to work on their software and didn't want to change.

Shrimp or Shrimps
Feb 14, 2012


Cygni posted:

I did that recently and had forgotten a few cards. This is my primary computer only, cause at some point i started getting... more... and more computers.

Paradise ISA
S3 Virge DX
Voodoo Rush
Voodoo Banshee
i740+Voodoo 2
Geforce 256 SDR
Radeon 8500 LE
Radeon 9600 Pro
Geforce 6600 GT
Radeon 4850
Radeon 7770 Ghz Edition
Geforce GTX 960
Geforce GTX 1060 6Gb
Geforce RTX 2080

I love it every time the post your graphics cards game comes up!!

ATI 3D Rage Pro
Geforce 2 GTS 32mb
Radeon 9800 Pro
Geforce 6600GT
Radeon X800GTO2 (flashed for 16 pipes)
Radeon X1900XT
Geforce GTX295
Geforce GTX 580
Radeon HD7870
GTX1080
GTX1060 (laptop)

Q_res
Oct 29, 2005

We're fucking built for this shit!
I forgot some of the huge gaps I had in my PC ownership.

Riva 128ZX
GeForce 2 MX400
GeForce FX 5600XT
Radeon X800 GTO
Radeon HD 4870
Radeon HD 6870
GeForce GTX 760
GeForce GTX 960m
GeForce GTX 1060
GeForce RTX 2070

CaptainSarcastic
Jul 6, 2013



The first computer I personally bought was a Pentium II running Windows 98, and I honestly can't remember the videocards I had in that one. I do remember it started with a pass-through card, and I upgraded it, but no recollection of the cards involved. Then I got a Pentium 4 that I think started with a GeForce 2MX, then I went to a Radeon 9600 Pro AIW and was good for years.

After that my memory gets a little better (this is my main desktops and not secondary/project boxes):

GeForce 6800 GT
GeForce 9500 GT
GeForce 9800 GTX
GeForce GTX 260
GeForce GTX 460
GeForce GTX 660
GeForce GTX 1060 6GB
GeForce RTX 2070 Super

I've owned and used a bunch of other, lesser cards but not in my main machine (including a PCI FX 5200, which I still have).

FBS
Apr 27, 2015

The real fun of living wisely is that you get to be smug about it.

This may explain my ignorance of 20th-century cards:

Radeon 9800 Pro (this was in a prebuilt family PC but my parents let me do the shopping)
GeForce 8800 GTS 320MB in my first personal PC
Radeon HD 6770 (lol) which I only bought for Skyrim
GeForce GTX 1080 which I've been stuck with ever since, thanks Nvidia

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

MikeC posted:

While I am not a technical guru, I have read and heard from others that while AMD cannot use image reconstruction in the form of DLSS, there are other methods of image reconstruction available that AMD could utilize for a DLSS-like feature that does not require the tensor cores found on the Nvidia lineup. It might not be as good, but it may be another case, like power efficiency, where 'good enough' will get the job done with respect to matching features.

As it is, CP2077 ran at 60fps at 1080p with ray tracing on... using a 2080TI. This is what I was getting at with the 2060 and 2070 class of cards probably being tossed when and if ray tracing hits the big time in games in the coming year. This is also why in the partpicker thread I tell people not to buy anything right now if ray tracing is what they want.

This to me represents AMD's window back into this market.

Maybe AMD can implement something like DLSS 1.9, which ran the reconstruction on the shader cores rather than the tensor cores. I don't know how much speedup it had relative to 2.0, though, and it was only ever implemented in Control.

It probably won't ever be as fast as having dedicated tensors but it might be less impactful on AMD cards like Vega which are typically bottlenecked by the fixed function parts of the pipeline long before they hit the shaders. A pipeline bottleneck means that shader processing is "free" in a sense, as long as it doesn't hit memory or other shared resources very much. This is a fun tradeoff you can make on GPUs - it is often more optimal to recompute some data rather than storing it and accessing it when you need it, because processing cycles are cheap compared to memory hits.
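
A toy version of that recompute-vs-store tradeoff, in plain Python standing in for shader code (on a GPU, the recompute path wins whenever the ALUs would otherwise idle behind a bandwidth or fixed-function bottleneck):

import math

xs = [i * 0.001 for i in range(100_000)]

# store: precompute a table, then pay a memory access per use
table = [math.sin(x) for x in xs]
total_stored = sum(table[i] * 2.0 for i in range(len(xs)))

# recompute: redo the sin() at every use, nothing held in memory
total_recomputed = sum(math.sin(x) * 2.0 for x in xs)

assert abs(total_stored - total_recomputed) < 1e-6  # same answer either way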

Also, I think AMD has their own equivalent of tensor cores in CDNA now? I would expect those to make an appearance in RDNA 3, it's a bit too soon for RDNA 2 (and I'm sure Sony/MS would have bragged about it if it were in there) but AMD likely knew NVIDIA was doing something with the tensor cores a year or two before it was public, and it's been almost 2 years since NVIDIA publicly announced the concept.

Radeon Image Sharpening is not anything close to DLSS 2.0, unfortunately, no matter how much people want it to be. It's basically just a sharpening filter, and that has pretty well-understood benefits and drawbacks. In particular it tends to introduce ringing artifacts around high-contrast areas; some people perceive this as "extra detail", but it's actually glitches caused by the sharpening, not anything in the actual game. It's like punching the "sharpening" slider to the max in Witcher 3: the game just crawls with artifacts. The problem is most people suck at critical analysis of images (and video/audio/etc), as we saw with the whole "radeon has better colors!" meme, and will happily insist it's better.
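
You can see the ringing for yourself by running a basic sharpening kernel over a hard edge (numpy; the kernel values are illustrative, not whatever RIS actually uses):

import numpy as np

edge = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)
kernel = np.array([-0.5, 2.0, -0.5])  # simple 1-D sharpen, weights sum to 1
print(np.convolve(edge, kernel, mode="same"))
# -> [ 0.   0.   0.  -0.5  1.5  1.   1.   1.5]
# the -0.5 dip and 1.5 overshoot around the step are the halo/"ringing":
# values the original image never contained, not recovered detail
# (the trailing 1.5 is just an array-boundary artifact)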

But on the whole - like GSync, this is an area where NVIDIA has pushed the state of the art and caught everybody else flat-footed. It'll take some time to copy their work in a way that evades patents/etc.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
The craziest thing to me (besides the fact that DLSS2 is better than native, which still amazes me) is that Nvidia's ambitious vision for gaming GPUs is coming true all at once.

Nvidia basically sacrificed Turing to make a gigantic leap ahead. It seemed like lunacy at the time, but goddamn if it doesn't look like 4D chess in 2020.

The final unknown piece of the Ampere puzzle is RTX. If the rumors are true, and RTX is vastly more efficient in Ampere, that constitutes the completion of their grand scheme. The stage is set for Ampere to be something really special. And DLSS will help ensure that Ampere is viable long past the normal shelf lives of high-end cards as well.

Besides the RTX question, it will also be interesting to see which SKUs actually get made on TSMC 7nm. That's just icing on the cake, but let's hope at least the 3080Ti/3080 make it to market on their fab.

I sound like such a fanboy, and maybe I am at this point, but I have never been so excited for a GPU launch. I've been through them all. I was lucky enough to be raised in Silicon Valley, my father was an engineer, so I always had the hotness from the beginning (through no merit of my own). So like many of you, I've been around the block a few times with GPU launches. That being said, it seems like so many loose ends are coming together at the same time in a way that is going to produce a spectacular product. I can't wait for the 31st.

shrike82
Jun 11, 2005

lol
