|
Lockback posted:Someone is still bitter they didn't have a Voodoo Rush and convinced themselves of SOME LIES Can your Voodoo Rush run the game at 1024x768?
|
# ? Aug 11, 2020 17:54 |
|
|
FBS posted:What happened 21 years ago? Did they even have graphics cards in 1999? ...do you think we were still using green screen text terminals like it was 1980?
|
# ? Aug 11, 2020 18:07 |
|
feedmegin posted:...do you think we were still using green screen text terminals like it was 1980? 99 didn't just have GPUs, it even had Mac GPUs! It was my first GPU ever, an ATI Rage, just in time for Unreal Tournament. That was like the first and last time a Mac shipped with a competitive GPU.
|
# ? Aug 11, 2020 18:14 |
|
SCheeseman posted:Mech 2 looked best with software rendering IMO. The low resolution, repeating textures slapped on to massive landscapes didn't do the game a lot of good. I refuse to accept you dishonoring the memory of the first game I ever played with a 3D accelerator and would challenge you under the ritual of Zellbrigen if it wasn't such a pain in the rear end to get MW2 working on netplay. My rose colored memories recall the game being loving gorgeous.
|
# ? Aug 11, 2020 18:17 |
|
Zero VGS posted:99 didn't just have GPUs, it even had Mac GPUs! It was my first GPU ever, an ATI Rage, just in time for Unreal Tournament I bought my Blue & White G3 in those two weeks when the Rage 128 was king. Been downhill ever since.
|
# ? Aug 11, 2020 18:23 |
|
When was it actually true that Macs had better graphics than PCs? Because as soon as graphics cards started being made, like the Voodoo and Riva TNT and whatever came before that, it must have stopped being true. But someone repeated that to me again in maybe 2007.
|
# ? Aug 11, 2020 18:29 |
|
It has more to do with software availability and optimization and integration of said software with the available hardware. i.e. if all the best producers with the most money buy macs and use a particular piece of mac software, and the best hardware add-ons are mac compatible, then you end up with the best production tool chain in spite of potentially "worse" hardware.
|
# ? Aug 11, 2020 18:46 |
|
Maybe never? The software was better for graphics, the system-level stuff, and the ecosystem.
|
# ? Aug 11, 2020 18:54 |
|
redreader posted:When was it actually true that Macs had better graphics than PC's? Because as soon as graphics cards started being made like the voodoo and Riva TNT and whatever was before that, it must have stopped being true. But someone repeated that to me again in maybe 2007. It was true for like 1 release of the iMac, but it legit was a good gaming PC. Back then 1 year made a huge difference so it didn't stay relevant for super long.
|
# ? Aug 11, 2020 21:04 |
|
Lockback posted:
In the 90’s I remember a computer that was 2 years old being positively useless for new games, that was insane. I bought my own computer for the first time in 2001 and was able to at least squeeze four years out of it with GPU and RAM upgrades. Ugh, one of those upgrades was the 5200FX. What a crap card. I think I only had that for a year.
|
# ? Aug 11, 2020 21:11 |
|
I just remembered this: lol
|
# ? Aug 11, 2020 21:13 |
|
Ugly In The Morning posted:In the 90’s I remember a computer that was 2 years old being positively useless for new games, that was insane. I bought my own computer for the first time in 2001 and was able to at least squeeze four years out of it with GPU and RAM upgrades. Yeah, in like the 97-2000 range you'd spend $2000 in 90s bucks on a PC and it'd be obsolete for games within 18 months. It was brutal. You could play games in software mode though and a bunch of people deluded themselves into thinking it was the same thing/better *ahem* Somewhere around the Geforce2 time frame or something things got better. Even the TNT2 held its own for a while. But yeah it was pretty nuts.
|
# ? Aug 11, 2020 21:15 |
|
Ah yea the bad old days of budget GPUs literally being absolutely loving useless. Like you'd go out and buy a Radeon 9200SE or an FX5200 and it could not run any game in the preceding two years properly unless you put it at like 320x240 low settings. There was nothing to justify their existence, if you tried to do a budget spec you would just go to bed pissed off with your money gone up in smoke. I'm glad we've moved on from those times. Now if you buy something like a 1650 or even a 1050ti you know it's low-end but it's still going to run games decently well at 720p-1080p high.
|
# ? Aug 11, 2020 21:17 |
|
Geforce 256, didn't that have some problems or was underwhelming or something? After I typed Geforce2 I was trying to remember why I thought that card was so, so much better than the 256. Maybe the Voodoo3 was just a better value against the 256 or something?
|
# ? Aug 11, 2020 21:17 |
|
Zedsdeadbaby posted:Ah yea the bad old days of budget GPUs literally being absolutely loving useless. Like you'd go out and buy a Radeon 9200SE or an FX5200 and it could not run any game in the preceding two years properly unless you put it at like 320x240 low settings. There was nothing to justify their existence, if you tried to do a budget spec you would just go to bed pissed off with your money gone up in smoke. I'm glad we've moved on from those times. Now if you buy something like a 1650 or even a 1050ti you know it's low-end but it's still going to run games decently well at 720p-1080p high. As far as the FX5200 goes, that whole line was a shitshow. I was able to get KOTOR running acceptably, somehow, but I got rid of that card as soon as I could and replaced it with the Radeon 9800 that I basically turned into a 9800 pro with BIOS shenanigans. If I hadn’t replaced the whole computer I probably could have kept that card going for ages.
|
# ? Aug 11, 2020 21:21 |
|
It was the first card with a hardware transform and lighting (T&L) engine, so it took a while for enough games to support it to make a difference, and the D3D drivers were absolute garbo when it first launched. People who had already purchased a TNT2 Ultra for the eye-watering price of $300 were miffed that Nvidia offered a whole new architecture so soon after that didn't do enough in D3D. Also, everyone widely mocked Nvidia's attempt to rebrand graphics cards as "GPUs" at the time, lol. It was also released right on the transition from SDR to DDR, when DDR was super expensive. By the time the Geforce 2 launched 4 months later (ohhhh these were the days), DDR prices had fallen enough to make it more mainstream, and the bandwidth jump was huge. I had a Geforce 256 SDR from Hercules(!) that I kept for a long, long time. Was a great card that aged exceptionally well by modern standards.
|
# ? Aug 11, 2020 21:30 |
|
Yeah, that seems right. I think it was a cool card that just didn't have a ton of value at the time, but the Geforce2 was a lot better. I may be remembering people happy they waited and THEN making fun of the scrubs who got the 256 vs hating on the card immediately.
|
# ? Aug 11, 2020 21:40 |
|
BIG HEADLINE posted:What worries me is that we'll get a release with SKUs that have 10-12GB of frame buffer, and then the "Super" refreshes will double it. Surely not, GDDR is pretty expensive, right? Edit: I found one place claiming it costs $22 for 8GB, but it's more like $10 a GB from Micron, which sounds believable. Carecat fucked around with this message at 22:09 on Aug 11, 2020 |
# ? Aug 11, 2020 22:02 |
|
I was stuck on a Prescott + FX5200 system for seven years. Some games had very low presets that disabled shaders so I could get a playable double digit frame rate!
|
# ? Aug 11, 2020 22:05 |
|
I can’t believe the FX5200 didn’t even have a fan in a lot of configurations, just a big ol’ finned heatsink.
|
# ? Aug 11, 2020 22:06 |
|
Carecat posted:Surely not, GDDR is pretty expensive right? Be careful: you're looking at GB vs Gb. Most of the pricing tables I've seen are in Gb, or 1/8 of a GB. ~$20/8Gb would be expensive, but not impossible for GDDR6. $22/8GB is cheaper than GDDR5 pricing. DrDork fucked around with this message at 22:33 on Aug 11, 2020 |
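To make the unit trap concrete, here's a minimal sketch of the conversion. The $1.25/Gb quote below is made up for illustration, not an actual Micron price:

```python
def per_gigabyte(price_per_gigabit: float) -> float:
    """Convert a memory spot price quoted in $/Gb to $/GB (1 GB = 8 Gb)."""
    return price_per_gigabit * 8.0

# A hypothetical quote of $1.25 per gigabit works out to $10 per gigabyte,
# so an 8 GB frame buffer would carry roughly $80 of memory cost.
print(per_gigabyte(1.25))      # 10.0
print(per_gigabyte(1.25) * 8)  # 80.0
```

Misread a $/Gb table as $/GB and you'll think the memory is 8x cheaper than it really is, which is exactly the mistake being flagged here.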
# ? Aug 11, 2020 22:26 |
|
I grew up with lovely video cards - Trident, Cirrus Logic cards and the S3 Virge. The odd thing is there’s a gap in my memory between the Virge and the first card I bought with my own money when I started working (660 Ti). I built several PCs for gaming in college (2004-2008) but don’t remember the parts I used - anyone remember what was cheap and mainstream then? I should probably dig through my email to see if I can find order receipts.
|
# ? Aug 11, 2020 23:52 |
|
shrike82 posted:I grew up with lovely video cards - Trident, Cirrus Logic cards and the S3 Virge. The odd thing is there’s a gap in my memory between the Virge and the first card I bought with my own money when I started working (660 Ti). I built several PCs for gaming in college (2004-2008) but don’t remember the parts I used - anyone remember what was cheap and mainstream then? Radeon 9700 wasn’t super cheap, except by today’s standards, but it’s the card I most remember from that rough window.
|
# ? Aug 12, 2020 00:17 |
|
My mother's work gave away their old work PCs, so I had some old Win 98 PC that I played TIE Fighter and X-Wing Alliance on. No idea of its specs, but it couldn't handle a burned copy of Quake 2 my brother's friend gave him. Poor thing booted the game and died.
|
# ? Aug 12, 2020 00:34 |
|
The 9700 was a 2002 era card, but was definitely a great buy. Probably the first GPU that was still relevant years after its release. The Geforce 7800 GT came out in 2005 and IIRC Nvidia was pretty dominant with the 7000 and 8000 series until AMD launched TeraScale with the 4000 series in 2008.
|
# ? Aug 12, 2020 00:35 |
|
shrike82 posted:I grew up with lovely video cards - Trident, Cirrus Logic cards and the S3 Virge. The odd thing is there’s a gap in my memory between the Virge and the first card I bought with my own money when I started working (660 Ti). I built several PCs for gaming in college (2004-2008) but don’t remember the parts I used - anyone remember what was cheap and mainstream then? I did that recently and had forgotten a few cards. My primary computer only, cause at some point I started getting... more... and more computers.

Paradise ISA
S3 Virge DX
Voodoo Rush
Voodoo Banshee
i740 + Voodoo 2
Geforce 256 SDR
Radeon 8500 LE
Radeon 9600 Pro
Geforce 6600 GT
Radeon 4850
Radeon 7770 GHz Edition
Geforce GTX 960
Geforce GTX 1060 6GB
Geforce RTX 2080
|
# ? Aug 12, 2020 00:50 |
|
I missed out on the entire 3DFx series, something that bothered me quite a bit. Used to fantasize about having a Voodoo 2 or Riva TNT2. I started with a 2MB ATI Rage II built into the mobo. I jumped from that to a P4 with the Geforce 2 MX400. Then:

ATI Radeon X800
ATI Radeon 5770XT
GTX 1070

My next card is definitely gonna be a 3080TI. I'm preordering with EVGA if that's even possible.
|
# ? Aug 12, 2020 01:29 |
|
As far as cards in computers I personally owned goes:

NVidia Vanta (lol)
NVidia FX5200 (lmao)
ATi Radeon 9800 “Pro” (now we’re talking)
2X NVidia 7800 GTX in SLI
GeForce 9600 GT
Radeon HD5750
Whatever garbage is in an Alienware Alpha
1660 Ti in my laptop and 2070 Super in my desktop
|
# ? Aug 12, 2020 01:41 |
|
My first real 3D accelerator was a Voodoo2 8MB, followed by:

Voodoo3 3000
Geforce 3
5900 Ultra
6800 GT AGP with Ultra BIOS
7950 GT
8800 GT
680
980

Currently on a 1080.
|
# ? Aug 12, 2020 02:13 |
|
Voodoo 2
GeForce MX440
Radeon 9600XT (first time I ever bought a GPU just for one game, HL2)
Radeon 4890
GeForce 560
GeForce 970
GeForce 1080Ti

Next up is a 3080Ti just for Cyberpunk 2077. I swear there was a card between the 9600XT and the 4890 but I'm drawing a blank right now. The first 2 cards were in family computers that I upgraded under careful guidance of my parents who knew nothing about computers, with the 9600XT being in my first computer build as a teen.
|
# ? Aug 12, 2020 02:20 |
|
After slumming it for a long time before buying an 8800GT, I've been buying a '70' Nvidia card or equivalent, skipping one generation, then buying the next '70' Nvidia card. Let's all take care of each other and our GPUs.
|
# ? Aug 12, 2020 02:36 |
|
Normally I would skip a generation but even with DLSS, RT is demanding enough that I’m probably bumping up to a 30 series and maybe even doing the 80 instead of the 70. I haven’t whaled out on graphics like that in 15 years.
|
# ? Aug 12, 2020 02:38 |
|
LRADIKAL posted:It has more to do with software availability and optimization and integration of said software with the available hardware. i.e. if all the best producers with the most money buy macs and use a particular piece of mac software, and the best hardware add-ons are mac compatible, then you end up with the best production tool chain in spite of potentially "worse" hardware. This is spot on. The "Macs are better for graphics" era definitely existed, but it was the late 80s to the mid 90s, and it was never really a gaming or performance thing. PC graphics hardware was a mess of incompatible standards with weird performance and feature gaps, and the limited Apple hardware set made for a comparatively easy and stable target.

Photoshop and PageMaker (now InDesign) started as Mac-only products, and even after the Windows ports came out, Mac users were first-class citizens lording it over the Windows folks for quite a while. Mac OS's font handling was also way better than anything you'd get on DOS or Windows for a long, long time. Not a big deal for most users, but essential for anybody trying to lay a page out to exact picas and points. If you were serious about any kind of print work, a Mac was absolutely necessary.

By the time consumer 3D cards started to become commonplace in the late 1990s and early 2000s, though, Apple's hardware was nothing special, and the pro graphics situation on Windows made it to close-enough feature parity. Windows' own font handling stayed garbage for a long time, but any application for people who cared included its own rendering engine and solved the problem itself. At that point Apple was mostly coasting on the momentum of designers and others who'd learned to work on their software and didn't want to change.
|
# ? Aug 12, 2020 02:58 |
|
Cygni posted:I did that recently and had forgotten a few cards. My primary computer only cause at some point i started getting... more... and more computers. I love it every time the post your graphics cards game comes up!!

ATI 3D Rage Pro
Geforce 2 GTS 32mb
Radeon 9800 Pro
Geforce 6600GT
Radeon X800GTO2 (flashed for 16 pipes)
Radeon X1900XT
Geforce GTX295
Geforce GTX 580
Radeon HD7870
GTX1080
GTX1060 (laptop)
|
# ? Aug 12, 2020 03:57 |
|
I forgot some of the huge gaps I had in my PC ownership.

Riva 128ZX
GeForce 2 MX400
GeForce FX 5600XT
Radeon X800 GTO
Radeon HD 4870
Radeon HD 6870
GeForce GTX 760
GeForce GTX 960m
GeForce GTX 1060
GeForce RTX 2070
|
# ? Aug 12, 2020 04:06 |
|
The first computer I personally bought was a Pentium II running Windows 98, and I honestly can't remember the video cards I had in that one. I do remember it started with a pass-through card, and I upgraded it, but no recollection of the cards involved. Then I got a Pentium 4 that I think started with a GeForce 2MX, then I went to a Radeon 9600 Pro AIW and was good for years. After that my memory gets a little better (this is my main desktops only, not secondary/project boxes):

GeForce 6800 GT
GeForce 9500 GT
GeForce 9800 GTX
GeForce GTX 260
GeForce GTX 460
GeForce GTX 660
GeForce GTX 1060 6GB
GeForce RTX 2070 Super

I've owned and used a bunch of other, lesser cards, but not in my main machine (including a PCI FX 5200, which I still have).
|
# ? Aug 12, 2020 04:09 |
|
This may explain my ignorance of 20th-century cards:

Radeon 9800 Pro (this was in a prebuilt family PC but my parents let me do the shopping)
GeForce 8800 GTS 320MB in my first personal PC
Radeon HD 6770 (lol) which I only bought for Skyrim
GeForce GTX 1080 which I've been stuck with ever since, thanks Nvidia
|
# ? Aug 12, 2020 04:18 |
|
MikeC posted:While I am not a technical guru, I have read and heard from others that while AMD cannot use image reconstruction in the form of DLSS, there are other methods of image reconstruction available that AMD could utilize for a DLSS-like feature that does not require the use of tensor cores found on the Nvidia lineup. It might not be as good but it may be another case, like power efficiency, where 'good enough' will get the job done with respect to matching features. Maybe AMD can implement something like DLSS 1.9, which ran the reconstruction on the shader cores rather than on the tensors. I don't know how much speedup it had relative to 2.0, though, and it was only ever implemented in Control. It probably won't ever be as fast as having dedicated tensors, but it might be less impactful on AMD cards like Vega, which are typically bottlenecked by the fixed-function parts of the pipeline long before they hit the shaders. A pipeline bottleneck means that shader processing is "free" in a sense, as long as it doesn't hit memory or other shared resources very much. This is a fun tradeoff you can make on GPUs: it is often more optimal to recompute some data rather than storing it and accessing it when you need it, because processing cycles are cheap compared to memory hits.

Also, I think AMD has their own equivalent of tensor cores in CDNA now? I would expect those to make an appearance in RDNA 3. It's a bit too soon for RDNA 2 (and I'm sure Sony/MS would have bragged about it if it were in there), but AMD likely knew NVIDIA was doing something with the tensor cores a year or two before it was public, and it's been almost 2 years since NVIDIA publicly announced the concept.

Radeon Image Sharpening is not anything close to DLSS 2.0, unfortunately, no matter how much people want it to be. It's basically just a sharpening filter, and that has pretty well-understood benefits and drawbacks. In particular, it tends to introduce ringing artifacts around high-contrast areas. Some people perceive this as "extra detail," but it's actually glitches caused by the sharpening; it's not in the actual game itself. It's like punching the "sharpening" slider to the max in Witcher 3: the game just crawls with artifacts. The problem is that most people suck at critical analysis of images (and video/audio/etc), as we saw with the whole "radeon has better colors!" meme, and will happily insist it's better.

But on the whole, like GSync, this is an area where NVIDIA has pushed the state of the art and caught everybody else flat-footed. It'll take some time to copy their work in a way that evades patents/etc.
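The ringing is easy to demonstrate in one dimension. Here's a minimal unsharp-mask sketch (NumPy, with a made-up 0-to-1 step edge; this is generic sharpening, not AMD's actual RIS implementation): sharpening a clean edge pushes values below 0 on the dark side and above 1 on the bright side, which renders as the halo you see around high-contrast areas.

```python
import numpy as np

# A clean step edge: dark region then bright region, values in [0, 1].
edge = np.array([0.0] * 5 + [1.0] * 5)

# Unsharp mask: sharpened = original + amount * (original - blurred)
blur_kernel = np.array([0.25, 0.5, 0.25])
blurred = np.convolve(edge, blur_kernel, mode="same")
sharpened = edge + 1.0 * (edge - blurred)

# The "extra detail" is overshoot: the result now leaves the valid
# [0, 1] range on both sides of the edge, i.e. ringing.
print(sharpened.min())  # below 0.0
print(sharpened.max())  # above 1.0
```

No new information is created; the filter just exaggerates the transition, and the overshoot gets clipped or displayed as a bright/dark fringe.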
|
# ? Aug 12, 2020 05:01 |
|
The craziest thing to me (besides the fact that DLSS 2 is better than native, which still amazes me) is that Nvidia's ambitious vision for gaming GPUs is coming true all at once. Nvidia basically sacrificed Turing to make a gigantic leap ahead. It seemed like lunacy at the time, but goddamn if it doesn't look like 4D chess in 2020.

The final unknown piece of the Ampere puzzle is RTX. If the rumors are true, and RTX is vastly more efficient in Ampere, that constitutes the completion of their grand scheme. The stage is set for Ampere to be something really special. And DLSS will help ensure that Ampere stays viable long past the normal shelf life of high-end cards as well. Besides the RTX question, it will also be interesting to see which SKUs are actually made on TSMC 7nm. That's just icing on the cake, but let's hope at least the 3080Ti/3080 make it to market on their fab.

I sound like such a fanboy, and maybe I am at this point, but I have never been so excited for a GPU launch. I've been through them all. I was lucky enough to be raised in Silicon Valley, my father was an engineer, so I always had the hotness from the beginning (through no merit of my own). So like many of you, I've been around the block a few times with GPU launches. That being said, it seems like so many loose ends are coming together at the same time in a way that is going to produce a spectacular product. I can't wait for the 31st.
|
# ? Aug 12, 2020 05:43 |
|
|
lol
|
# ? Aug 12, 2020 05:48 |