Cygni posted:There was also a whole industry around modding GeForce cards into Quadros and Radeons into FireGLs, with some mods being as easy as sticking a piece of tape on the card. Nvidia and AMD wised up to those tricks.

I remember doing this with my Ti 4600. It gave me the ability to use 2-4x antialiasing at nearly no performance cost. It was pure magic to me.
# Nov 28, 2023 10:36
The early 2000s absolutely were a magical time for those kinds of "hardware hacks". I remember there being complaints about, for example, the GeForce 3 (the first discrete video card I ever bought) because it was "boring" and didn't have those kinds of ~hax~. The video card manufacturers were still young as companies for the most part, and most weren't aware of just how resourceful end users could be, especially with the aid of the nascent Internet. (AMD and ATI should've known a little better given their institutional experience, though.)
SpaceDrake posted:The early 2000s absolutely were a magical time for those kinds of "hardware hacks". I remember there being complaints about, for example, the GeForce 3 (the first discrete video card I ever bought) because it was "boring" and didn't have those kinds of ~hax~. The video card manufacturers were still young as companies for the most part, and most weren't aware of just how resourceful end users could be, especially with the aid of the nascent Internet. (AMD and ATI should've known a little better given their institutional experience, though.)

I think sometime around the Nvidia 6000 series they started using laser cuts to physically block off those extra features. We had a 6200 AGP in the family computer, and I remember being bummed out when I found it was a revision they'd done that to. Kind of mean-spirited imo, the cores of wrath grew heavy on the die that day

Rigged Death Trap fucked around with this message at 09:34 on Nov 28, 2023
Rigged Death Trap posted:the cores of wrath

lmao