mayodreams
Jul 4, 2003


Hello darkness,
my old friend

movax posted:

Blast from the past: an Nvidia Riva 128 die


Thanks because that owned.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
AnandTech will be doing a Q&A with a GPGPU sexpert, so submit some questions.

And by "sexpert," I mean he was a founder of Aegia (of PhysX fame), became one of Nvidia's top CUDA guys, and is now with AMD spearheading their hetereogeneous systems architecture, i.e. the seamless integration of highly-parallel cores (i.e. GPUs) with complex serial cores (i.e. CPUs).

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

AnandTech will be doing a Q&A with a GPGPU sexpert, so submit some questions.

And by "sexpert," I mean he was a founder of Aegia (of PhysX fame), became one of Nvidia's top CUDA guys, and is now with AMD spearheading their hetereogeneous systems architecture, i.e. the seamless integration of highly-parallel cores (i.e. GPUs) with complex serial cores (i.e. CPUs).

That guy has the job that, in a parallel, cooler universe, my dad has, and that DadP,CU rules: he got me into computers for real at a younger age, and now I'm on my way to being cool as heck, knowing lots about microarchitectures, and eventually being technology coordinator for a major computing firm.

But here in the real world I just look at that dude, who isn't my DadP,CU, and his CV makes me go :allears: because seriously, what cool tech he's been a part of. I use work he was integral to all the time, and probably will continue to do so until a major computing paradigm shift.

Star War Sex Parrot
Oct 2, 2003

Factory Factory posted:

AnandTech will be doing a Q&A with a GPGPU sexpert, so submit some questions.

And by "sexpert," I mean he was a founder of Aegia (of PhysX fame), became one of Nvidia's top CUDA guys, and is now with AMD spearheading their hetereogeneous systems architecture, i.e. the seamless integration of highly-parallel cores (i.e. GPUs) with complex serial cores (i.e. CPUs).
drat I remember when that dude was just sitting at a bare table with a laptop, showing off liquid simulations at some lovely Pepcom show that coincided with E3 because he couldn't actually get into E3. He was saying "we think we can do this with hardware and game developers will like it."

KillHour
Oct 28, 2007


I'm not really seeing any recent games that use PhysX. Have there been any high profile ones that use it for more than incidental effects in the last year or 2?

Nomenklatura
Dec 4, 2002

If Canada is to survive, it can only survive in mutual respect and in love for one another.
Since this is the Videocard discussion thread, thought I'd ask here: anybody found a fix/workaround for that damned stupid HDMI audio bug that was introduced with the latest (12.4) Catalyst drivers? Only thing that googling turns up is unplugging the HDMI cord and plugging it back in again. If you don't do that, the HDMI audio just won't start up.

(How the hell does something like that get through testing? Some weird edge case with Skyrim, sure, but it's not like HDMI is exactly rare these days.)

Double Punctuation
Dec 30, 2009

Ships were made for sinking;
Whiskey made for drinking;
If we were made of cellophane
We'd all get stinking drunk much faster!
Catalyst drivers have never handled audio very well. I had a card hooked into an AV receiver, and if I ever switched between the cable box and the computer, the system would think the receiver was plugged in when it wasn't and vice versa. I'd have to disable and re-enable both the graphics and HDMI audio drivers in Device Manager to get it to work again.

I guess you could make a batch file that does that using DevCon and run it as administrator as needed.
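
Something like this, as a rough sketch only, if you don't want to click through Device Manager every time. I sketched it as a little Python wrapper around devcon rather than a straight batch file; the devcon path and the hardware ID patterns are placeholders you'd pull from "devcon find *" or Device Manager for your own card.

code:
# Rough sketch: bounce the GPU and HDMI audio devices with DevCon, same as
# disabling/re-enabling them in Device Manager. Run from an elevated prompt.
import subprocess

DEVCON = r"C:\tools\devcon.exe"        # placeholder: wherever devcon.exe lives
DEVICES = [
    r"PCI\VEN_1002&DEV_XXXX*",         # placeholder hardware ID: the Radeon GPU
    r"HDAUDIO\FUNC_01&VEN_1002*",      # placeholder hardware ID: AMD HDMI audio
]

def bounce(hwid):
    """Disable and then re-enable one device by hardware ID."""
    subprocess.run([DEVCON, "disable", hwid], check=True)
    subprocess.run([DEVCON, "enable", hwid], check=True)

if __name__ == "__main__":
    for hwid in DEVICES:
        bounce(hwid)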

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

KillHour posted:

I'm not really seeing any recent games that use PhysX. Have there been any high profile ones that use it for more than incidental effects in the last year or 2?

Hrm. Off the top of my head - Mass Effect games, Mirror's Edge, Metro 2033, Batman:AA, Batman:AC, Battlefield 3. Some upcoming games - Metro: Last Light, Borderlands 2.

Best example for wow-factor is Metro 2033, where it's used for real-time cloth simulation along with additional volumetric light processing - bullet holes in cloth with real-time volumetric shafts of light beaming out; it looks pre-rendered, but nope.

RAGE had fantastic hardware-accelerated megatexture streaming on nVidia cards as well, though that's not really PhysX, just compute that was optimized heavily for Fermi.

The idea of compute tasks on the card is a great one. Everyone, including me, thought it was stupid back when "dedicated physics cards" were getting tossed around. I mean, hey, we had DUAL CORE processors, the second core could handle physics just fine. Turns out GPU architecture is way better at physics stuff, who knew :v:

Doesn't need to be or stay proprietary, though. That's a lesson I feel nVidia is learning with time: even if they do have barrels of money, it's better to be generalist and have their cards offer perks than to be so restrictive that they don't end up getting used broadly. See: original proprietary MLAA vs. "oh yeah? how 'bout now? :smuggo:" FXAA.

Star War Sex Parrot
Oct 2, 2003

Agreed posted:

Hrm. Off the top of my head - Mass Effect games, Mirror's Edge, Metro 2033, Batman:AA, Batman:AC, Battlefield 3. Some upcoming games - Metro: Last Light, Borderlands 2.
Mass Effect and Battlefield 3 did not have hardware PhysX, at least according to NVIDIA.

edit: Also the Batman games have used it to the best effect, in my experience. The Scarecrow sequences in AA were night and day.

https://www.youtube.com/watch?v=6GyKCM-Bpuw&hd=1

Star War Sex Parrot fucked around with this message at 06:50 on May 15, 2012

eggyolk
Nov 8, 2007


PhysX seems to only enable the cutting edge WDC feature (wafting-drop-cloth) in a few select games. See here and here. Pretty deal breaking IMO.

Josh Lyman
May 24, 2009


eggyolk posted:

PhysX seems to only enable the cutting edge WDC feature (wafting-drop-cloth) in a few select games. See here and here. Pretty deal breaking IMO.
You mean that it's really not that useful and nobody should go with a GeForce over a similar Radeon because of PhysX.

BLOWTAKKKS
Feb 14, 2008

I'm interested in getting a GTX 690 because I have no self control, and I want to give my brother one of my 570s since he's stuck on a GTS 250. Are ASUS or EVGA going to come out with anything other than reference cards? I'm considering buying one, but I want to make sure that a slightly better version isn't on the way.

kuddles
Jul 16, 2006

Like a fist wrapped in blood...
EVGA usually doesn't come out with a different version of 2-in-1 GPUs other than one pre-designed for water cooling, but ASUS has been coming out with third-party cooling solutions on almost every card recently, so you might want to wait a bit and see if it's that important to you.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Star War Sex Parrot posted:

Mass Effect and Battlefield 3 did not have hardware PhysX, at least according to NVIDIA.

edit: Also the Batman games have used it to the best effect, in my experience. The Scarecrow sequences in AA were night and day.

https://www.youtube.com/watch?v=6GyKCM-Bpuw&hd=1

I think everyone who does use PhysX has their favorite, but it's just another physics engine at the end of the day - prettier than most, if you're using an expensive nVidia card and can turn it on without eating framerate poo poo. Havok, other licensed engines, and custom physics engines do fine, but GPU acceleration allows for some remarkable stuff. Bringing us back to the original topic, which was "that guy had a cool impact on technology, bringing first PhysX, then CUDA, and now heading AMD's push for GPGPU+CPU integration" - that's a bit more significant than "pffft, who uses PhysX anyway?"

Edit: SWSP mentioned that some titles were not GPU-accelerated, and I guess that's a good point while we're on the PhysX tangent. Software PhysX looks good too; it's just more of a "this could probably be accomplished with other premade physics engines" kind of good, not exclusive, really knock-out impressive good. Some very neat stuff, but nothing with the WOW!-factor of Batman:AA/Batman:AC and their massive integration of GPU-accelerated PhysX to do stuff that would slow the CPU to a crawl.

Agreed fucked around with this message at 15:37 on May 15, 2012

movax
Aug 30, 2008

I'm glad his tech gets to live on somewhat inside of Nvidia's GPUs though, versus just dying out completely when the market proved not enough people were going to buy a dedicated PhysX add-in card for their games.

And yeah, Batman is the only game I (quickly) remember having some cool PhysX environmental effects and such (and the duh-obvious cape cloth). I think both of those titles received heavy assistance from Nvidia though.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Mirror's Edge, too. Cloth, shattering glass, and particles.

https://www.youtube.com/watch?v=w0xRJt8rcmY

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

movax posted:

I'm glad his tech gets to live on somewhat inside of Nvidia's GPUs though, versus just dying out completely when the market proved not enough people were going to buy a dedicated PhysX add-in card for their games.

And yeah, Batman is the only game I (quickly) remember having some cool PhysX environmental effects and such (and the duh-obvious cape cloth). I think both of those titles received heavy assistance from Nvidia though.

The cape effects actually aren't PhysX, as I recall, because they wanted everyone (consoles included) to see them.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

The cape effects actually aren't PhysX, as I recall, because they wanted everyone (consoles included) to see them.

PhysX is just another physics API. The cape is PhysX - it's just not GPU-accelerated PhysX. Lots of games run PhysX on the processor without any GPU acceleration; it's like Havok or other pre-made physics engines, just with some really cool effects possible with GPU acceleration if the devs get serious with it (with nVidia's tech team helping, usually - I recall they were deeply involved in the Batman games' development tech-wise).

ijyt
Apr 10, 2012

Dogen posted:

The cape effects actually aren't PhysX, as I recall, because they wanted everyone (consoles included) to see them.

They're enhanced and much more natural with PhysX though; playing on my 5850, the cape is a lot more rigid.

Klyith
Aug 3, 2007

GBS Pledge Week

movax posted:

As a product of the late 90s, the NV3 packed some bitchin' features.
One feature you missed: support for 24/32-bit color in 3D. The main competition, the Voodoo2, could only render at 16-bit. At the time I had a completely stacked gaming computer (bought for me by my grandmother for "important college work" :raise:), and I had both of them. In the beginning I was a total quake-head and used the Voodoo2 all the time for Glide... But then a little game called Homeworld came out and showed me that things like color, art, and immersion really mattered in games. I think that game was pretty much the first one that was designed -- with its fantastic brightly colored gradient backgrounds and layered transparent effects -- for full 32-bit color, and only fully enjoyable that way.

(Also does anyone else remember the Riva128 vs Voodoo2 flamewars? So epic. So dumb.)

feld
Feb 11, 2008

Out of nowhere its.....

Feldman

text editor posted:

Mostly saving this spot for some spergy unixposting, but as far as graphics go, for substance for now I'll just add that the level of graphics support on Linux and BSD has changed a bit due to the kernel mode setting (KMS) requirements of newer Intel, AMD, and Nvidia cards.

You're confusing Nvidia with Nouveau. Nouveau is going the KMS/GEM/DRM route, but Nvidia's binary blob, which is superior in every way except that it's not open source, will probably never use KMS.

movax
Aug 30, 2008

Klyith posted:

One feature you missed: support for 24/32-bit color in 3D. The main competition, the Voodoo2, could only render at 16-bit. At the time I had a completely stacked gaming computer (bought for me by my grandmother for "important college work" :raise:), and I had both of them. In the beginning I was a total quake-head and used the Voodoo2 all the time for Glide... But then a little game called Homeworld came out and showed me that things like color, art, and immersion really mattered in games. I think that game was pretty much the first one that was designed -- with its fantastic brightly colored gradient backgrounds and layered transparent effects -- for full 32-bit color, and only fully enjoyable that way.

(Also does anyone else remember the Riva128 vs Voodoo2 flamewars? So epic. So dumb.)

Hm, for some reason I thought I read that the Riva 128 was still limited to 16-bit color depth in 3D operations (I think Wiki says that), but if that isn't the case I will totally update that post!

Boten Anna
Feb 22, 2010

Klyith posted:

One feature you missed: support for 24/32-bit color in 3D. The main competition, the Voodoo2, could only render at 16-bit. At the time I had a completely stacked gaming computer (bought for me by my grandmother for "important college work" :raise:), and I had both of them. In the beginning I was a total quake-head and used the Voodoo2 all the time for Glide... But then a little game called Homeworld came out and showed me that things like color, art, and immersion really mattered in games. I think that game was pretty much the first one that was designed -- with its fantastic brightly colored gradient backgrounds and layered transparent effects -- for full 32-bit color, and only fully enjoyable that way.

(Also does anyone else remember the Riva128 vs Voodoo2 flamewars? So epic. So dumb.)

Since this was the 90s, by both of them do you mean you would actually open up your computer and swap them out? Or since this was the 90s, do you mean they both just sat on the uniform AGP/PCI slots and you just plugged in the one you wanted to use at the time?

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Boten Anna posted:

Since this was the 90s, by both of them do you mean you would actually open up your computer and swap them out? Or since this was the 90s, do you mean they both just sat on the uniform AGP/PCI slots and you just plugged in the one you wanted to use at the time?

The Voodoo and Voodoo2 didn't replace the other graphics card in the system; they were 3D-only devices that sat between your "main" video card and the monitor with a little loopback cable (this is why, if you look at them, they have two VGA ports: one in, one out). When you used the Voodoo card, it made a little clunk thanks to some mechanical relays, disconnected your other video card, and took over the VGA output. When you stopped using it, it went "clunk" again, and turned into a dumb bunch of wires that just passed the other video card's signal through.

Klyith
Aug 3, 2007

GBS Pledge Week

movax posted:

Hm, for some reason I thought I read that the Riva 128 was still limited to 16-bit color depth in 3D operations (I think Wiki says that), but if that isn't the case I will totally update that post!
Oops, I was wrong! :sweatdrop: I had a first-gen TNT. I got the computer in late '98, just after it came out. I now remember that the computer (ordered from Gateway's newfangled custom-built-computers-over-the-internet store!) was like 2 months late, and during that delay the TNT came out and got put in my machine even though it was ordered with a Riva 128.

Man, nostalgia. I never had good computers when I was growing up despite being desperately obsessed with them. No c64, no amiga, a 286 long into the 486 and early pentium days, etc. Getting that college computer was like the apotheosis of all my boyhood computer lust.

Boten Anna posted:

Since this was the 90s, by both of them do you mean you would actually open up your computer and swap them out? Or since this was the 90s, do you mean they both just sat on the uniform AGP/PCI slots and you just plugged in the one you wanted to use at the time?
Nope. Since this was the late 90s, the TNT plugged into the AGP slot, sent video out through VGA into the Voodoo2's passthrough port, and thence out again to the monitor. If you wanted SLI you had to add a third pass to the daisychain! In those days SLI was really about (vga) scan lines!

movax
Aug 30, 2008

Klyith posted:

Oops, I was wrong! :sweatdrop: I had a first-gen TNT. I got the computer in late '98, just after it came out. I now remember that the computer (ordered from Gateway's newfangled custom-built-computers-over-the-internet store!) was like 2 months late, and during that delay the TNT came out and got put in my machine even though it was ordered with a Riva 128.

Man, nostalgia. I never had good computers when I was growing up despite being desperately obsessed with them. No c64, no amiga, a 286 long into the 486 and early pentium days, etc. Getting that college computer was like the apotheosis of all my boyhood computer lust.

Heh, no problem. I knew it was a (touted) feature of the TNT, which was my first GPU as well (in a Dell Dimension R400). I got my first taste of driver issues when Rogue Squadron 3D came out, and the drivers at the time resulted in only the shadows of the ships being drawn. :downs: STB Velocity 4400 supremacy.

I remember reading a good, long piece on the fall/decline of 3dfx, I will try to dig that up again.

movax fucked around with this message at 21:31 on May 15, 2012

Fuzz
Jun 2, 2003

Avatar brought to you by the TG Sanity fund
Regarding SLI... how big a difference do brand, clock speed, and memory speed make? I ask because I have an EVGA 1GB GTX 560 Ti (model 01G-P3-1561-AR, posting from my phone so I can't link it) that's got a mild overclock to 850MHz core, 1700MHz shader, 4104MHz memory at the moment, but TigerDirect has an insane sale on a Zotac card (ZT-50304-10M), same chip and memory amount, but the core and shader clocks are faster and the memory is slightly slower. Would they not play nice together in SLI and should I go with a matched card, or would they be okay? And how much of a performance hit (if any) would I take vs. 2 of the EVGA cards together?
OP doesn't have specifics.

movax
Aug 30, 2008

Fuzz posted:

Regarding SLI... how big a difference do brand, clock speed, and memory speed make? I ask because I have an EVGA 1GB GTX 560 Ti (model 01G-P3-1561-AR, posting from my phone so I can't link it) that's got a mild overclock to 850MHz core, 1700MHz shader, 4104MHz memory at the moment, but TigerDirect has an insane sale on a Zotac card (ZT-50304-10M), same chip and memory amount, but the core and shader clocks are faster and the memory is slightly slower. Would they not play nice together in SLI and should I go with a matched card, or would they be okay? And how much of a performance hit (if any) would I take vs. 2 of the EVGA cards together?
OP doesn't have specifics.

Most importantly, they have the same chipset, so they'll work in SLI. Next, they have the same amount of VRAM, so none will be wasted. I'm ~95% sure they'll run at the slower of the two clockspeeds, but nothing is stopping you from cranking up the Zotac clocks to see if they can match up with your eVGA card.

Boten Anna
Feb 22, 2010

Woah, that passthrough stuff is amazing, the wonders of the 90s that I didn't get to experience :monocle:

I remember my first graphics card was a Diamond Viper, and it was totally badass and stuff. I also remember playing with UltraHLE and having to use Glide wrappers to get it to work without really knowing what they were, entirely. Then having my mom excited to hear I got the new Zelda working (the only video game she ever liked) and having to leave within a minute because it was making her nauseous, haha.

Klyith
Aug 3, 2007

GBS Pledge Week
So if we're taking a trip down memory lane to the GPU technology of the 90s, let's make a quick stop to mention S3. At the same time as Nvidia was making the Riva 128, S3 made the ViRGE, which was groundbreaking in its own way. The ViRGE was another combined 2D/3D accelerator, but it was a bit... weak at that 3D thing. In fact it was so bad that as soon as you asked it to do anything more complex than render a single unfiltered texture on a poly, its performance dropped off a cliff. The average CPU of the time using software rendering could play Quake better than a ViRGE; it was mockingly called a "video de-accelerator" on usenet and the early internet.

However, the silver lining to the dark cloud of S3 failure was their mighty efforts to find *some* way to make their followup, the Savage 3D, perform faster. Nothing inspires engineers like having a bunch of nerds mock your efforts for years. So they invented two things that are still in use today: 1-cycle trilinear filtering and a little thing called S3TC, which was licensed by Microsoft to use in DirectX as DXTC: DirectX Texture Compression.
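
For anyone who's never looked at it, here's a rough sketch of what an S3TC/DXT1 block actually is (my own illustration from the publicly documented format, not S3's code): 16 texels packed into 8 bytes as two RGB565 endpoint colors plus 2-bit indices, which is why a GPU can keep whole texture sets compressed in VRAM and still sample them randomly.

code:
# Rough illustration of decoding one 8-byte S3TC/DXT1 block (4x4 texels).
import struct

def rgb565_to_rgb888(c):
    """Expand a 16-bit RGB565 value into 8-bit (R, G, B) channels."""
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_dxt1_block(block):
    """Return a 4x4 grid of RGB texels from one 8-byte DXT1 block."""
    c0_raw, c1_raw, indices = struct.unpack("<HHI", block)
    c0, c1 = rgb565_to_rgb888(c0_raw), rgb565_to_rgb888(c1_raw)
    if c0_raw > c1_raw:
        # Opaque mode: the palette is the two endpoints plus two blends.
        palette = [c0, c1,
                   tuple((2 * a + b) // 3 for a, b in zip(c0, c1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(c0, c1))]
    else:
        # Punch-through mode: midpoint color plus transparent black.
        palette = [c0, c1,
                   tuple((a + b) // 2 for a, b in zip(c0, c1)),
                   (0, 0, 0)]
    # Each texel is a 2-bit palette index, packed LSB-first, row by row.
    return [[palette[(indices >> (2 * (4 * y + x))) & 0b11] for x in range(4)]
            for y in range(4)]

# Example: a block whose endpoint colors are pure green and pure blue.
print(decode_dxt1_block(bytes([0xE0, 0x07, 0x1F, 0x00, 0b00011011, 0, 0, 0])))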

It wasn't enough at all. The Savage 3D had average performance but couldn't be made cheaply due to horrendous yield. (Wikipedia sez: "S3's yield problems forced Hercules to hand pick usable chips from the silicon wafers.") The Savage 4 fixed the yield problems but didn't get much faster, and was a cheapo second fiddle to the TNT2 and ATI Rage. The Savage2000 was kinda better, with hardware performance nearly equal to the new GeForce, but drivers so atrocious that most games were unplayable.


S3 quit the video chip market and sold everything to VIA, then merged with Diamond Multimedia, with whom they had worked on a cool little product: the Rio PMP300, one of the first portable MP3 players. I had a PMP500 (a successor model), with 64MB of internal flash plus a 64MB MMC card. It's impossible to explain how baller it was to walk around with that when everyone else still had big skipping CD walkmen. Later they were behind the ReplayTV, one of the first time-shifting digital TV recorders. Both of these products got them sued, by the RIAA and MPAA respectively, which made them waste all their time and money fighting two huge lawsuits and eventually killed them even though they won. Which is why the media companies to this day still sue any technology they hate and fear, and why Apple rules the world instead of S3.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Klyith posted:

So they invented two things that are still in use today: 1-cycle trilinear filtering and a little thing called S3TC, which was licensed by Microsoft to use in DirectX as DXTC: DirectX Texture Compression.
I still remember installing the S3TC texture pack that came with Unreal Tournament (UT99) GOTY edition and being completely awed by the amazing, high-resolution graphics.

Fuzz
Jun 2, 2003

Avatar brought to you by the TG Sanity fund
Nevermind on my previous question, turns out my second PCIe 2.0 slot is only 4x, not 16, so it can only support Crossfire, not SLI. :smith:
Oh well, not a huge deal, not like my performance as is is bad.

movax
Aug 30, 2008

Fuzz posted:

Nevermind on my previous question, turns out my second PCIe 2.0 slot is only 4x, not 16, so it can only support Crossfire, not SLI. :smith:
Oh well, not a huge deal, not like my performance as is is bad.

What's your motherboard model? No x8/x8 bifurcation?

Fuzz
Jun 2, 2003

Avatar brought to you by the TG Sanity fund

movax posted:

What's your motherboard model? No x8/x8 bifurcation?

Gigabyte GA-P67A-D3-B3... Sandybridge setup, seems to only list 16x/4x. Only got it a year ago, so going that far to upgrade is out of the question.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Gigabyte strikes again :argh:

movax
Aug 30, 2008

Fuzz posted:

Gigabyte GA-P67A-D3-B3... Sandybridge setup, seems to only list 16x/4x. Only got it a year ago, so going that far to upgrade is out of the question.

Ahh yeah, poo poo, you're out of luck, sorry. Only 2 PCIe x16 slots and all 16 lanes from the CPU are run to one slot, and the other gets a x4 from the PCH. :(

Drighton
Nov 30, 2005

Dogen posted:

Gigabyte strikes again :argh:

Wait, what? I thought they had decent mobos? I switched from buying ASUS mobos after plenty of compatibility and driver issues with the last two I owned.

4 Day Weekend
Jan 16, 2009
I was thinking about getting a 670/680, but since I have a 570 already I'm now thinking of grabbing another and SLI-ing them. What's the requirement for having SLI compatible cards? I've got a Gigabyte GTX570 (link) and I was looking at getting an EVGA one (link).

Same mem, same model and clock speeds close enough for me to underclock the GB one.

I've also got a board with 2 PCI-E 3.0 16x slots (asrock z77 extreme4) and a decent PSU. Anything else I need/should know?

movax
Aug 30, 2008

4 Day Weekend posted:

I was thinking about getting a 670/680, but since I have a 570 already I'm now thinking of grabbing another and SLI-ing them. What's the requirement for having SLI compatible cards? I've got a Gigabyte GTX570 (link) and I was looking at getting an EVGA one (link).

Same mem, same model and clock speeds close enough for me to underclock the GB one.

I've also got a board with 2 PCI-E 3.0 16x slots (asrock z77 extreme4) and a decent PSU. Anything else I need/should know?

The OP admittedly needs some organizational banners and such to clarify, but I don't blame you for missing that part, as it's one paragraph in like 15k words.

Anyway, same GPU is absolutely required. Same memory is good because you won't waste any. I don't know for sure what happens with regard to clock speeds but it's safe to assume that the faster card will downclock to match its slower brother.

It may be worth investigating 2x 570 performance vs. single 670 performance, and seeing how much you can sell your 570 for.

td4guy
Jun 13, 2005

I always hated that guy.



If anyone's waiting on the dual-fan overclocked EVGA SC Signature 2 GTX 680, it's apparently still coming, and more information will be released on it at the end of the month. (source)

No idea why you'd want it over the ASUS version aside from brand loyalty though.
