forbidden dialectics
Jul 26, 2005





AMD should sue Qualcomm for allowing them to make the worst business decision of the century. Even the judge would give them an "aww, you dumb kids!" kind of look before dismissing it.


RocketSurgeon
Mar 2, 2008

No Gravitas posted:

Isn't that connector a standard length? Or the screws? Or the VGA ports are a standard size? You can probably use proportions from a known quantity and calculate out exactly what you want.

e.g: If I know that EICP connector is 10 meters long, and it seems like it occupies half the card, then the card must be 20 meters. You can get very precise with ruler tools in photo editing programs.

gently caress, wish I would have thought of that. Thanks!
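
The proportion trick quoted above is just a linear scale factor. Here's a minimal sketch in Python; the PCIe x16 connector length (~89 mm) and the pixel measurements are illustrative assumptions, not values from the thread:

```python
def estimate_length(ref_px, ref_real_mm, target_px):
    """Estimate a real-world length from a photo, given a reference
    feature of known size lying in roughly the same plane."""
    # Dividing last keeps the arithmetic exact for this example
    return target_px * ref_real_mm / ref_px

# A PCIe x16 connector is about 89 mm long. If it spans 445 px in the
# photo and the whole card spans 1335 px, the card is roughly:
print(estimate_length(445, 89.0, 1335))  # 267.0 (mm)
```

Any feature of standardized size (bracket height, slot spacing, screw pitch) works as the reference, as long as it sits in roughly the same plane as the thing you're measuring, so perspective doesn't skew the ratio.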

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

Factory Factory posted:

This isn't patent trolling. According to Nvidia, it tried to work out licensing with Samsung for a long time before filing this suit. The GeForce 256 was the first totally integrated GPU. Unified shaders were a major shift in GPU design. Why shouldn't these things be patented? They're not even software techniques, but rather hard-goods inventions.

It looks to me like Samsung is just being caught in the crossfire due to being a major purchaser of mobile chips, honestly. Nvidia wants to hit Qualcomm and take a big chunk out of their wallet at the same time. Nvidia wouldn't need to work out a licensing agreement with Samsung as they aren't in the business of producing mobile graphics chips.

Mad_Lion
Jul 14, 2005

I'm still mad at NVidia for destroying 3DFX. I was a diehard and I actually ran a Voodoo 5 5500 dual GPU card (which basically competed with the original GeForce, despite coming out when the GeForce 2 debuted).

I played Unreal Tournament and Quake 3 just fine, thank you very much.

I'm AMD all the way ever since. I have a stupid grudge; luckily, ATI and AMD haven't let me down yet.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Mad_Lion posted:

I'm still mad at NVidia for destroying 3DFX.

this is like me with steam because I bought the half-life retail pack and found out I couldn't come back to it to play online

then ten years later I started playing computer games and

Panty Saluter
Jan 17, 2004

Making learning fun!

spasticColon posted:

Why page 256? Is that number significant in some way?

it's the maximum number of values representable in an 8-bit word v:v:v
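
For the curious, the arithmetic is just 2^8: each additional bit doubles the number of distinct patterns a word can hold. A one-liner to check:

```python
bits = 8
values = 2 ** bits  # each bit doubles the count of distinct patterns
print(values)       # 256 (values 0 through 255)
```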

Sidesaddle Cavalry posted:

this is like me with steam because I bought the half-life retail pack and found out I couldn't come back to it to play online

then ten years later I started playing computer games and

I bought the retail pack years before Steam and they still honored my keys. It ruled since I could finally stop carting those CDs around.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

isndl posted:

It looks to me like Samsung is just being caught in the crossfire due to being a major purchaser of mobile chips, honestly. Nvidia wants to hit Qualcomm and take a big chunk out of their wallet at the same time. Nvidia wouldn't need to work out a licensing agreement with Samsung as they aren't in the business of producing mobile graphics chips.

Not the IP, but they absolutely do produce chips with GPUs - their Exynos SoCs use Mali and PowerVR GPU blocks.

Fame Douglas
Nov 20, 2013

by Fluffdaddy

Sidesaddle Cavalry posted:

this is like me with steam because I bought the half-life retail pack and found out I couldn't come back to it to play online

then ten years later I started playing computer games and

This is no one's fault but your own: you can redeem your Half-Life key with Steam; Valve will even throw in all the add-ons and some additional multiplayer titles/mods for free.

https://support.steampowered.com/kb_article.php?ref=7480-wusf-3601

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

Factory Factory posted:

Not the IP, but they absolutely do produce chips with GPUs - their Exynos SoCs use Mali and PowerVR GPU blocks.

Well, I guess that shows how much I pay attention to the whole SoC scene. Samsung does look like a more legitimate target in that case.

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



Regarding the AMD 285 cards - it's a pretty safe bet that a 285X will follow shortly, right?
I haven't seen any announcements yet, but it seems almost certain at this point.

Big Mackson
Sep 26, 2009
i am so happy i have a 290x for the freesyncs mmmmmmm delissshhhh i am so content ent ent ent.

(USER WAS PUT ON PROBATION FOR THIS POST)

1gnoirents
Jun 28, 2014

hello :)

Mad_Lion posted:

I'm still mad at NVidia for destroying 3DFX. I was a diehard and I actually ran a Voodoo 5 5500 dual GPU card (which basically competed with the original GeForce, despite coming out when the GeForce 2 debuted).

I played Unreal Tournament and Quake 3 just fine, thank you very much.

I'm AMD all the way ever since. I have a stupid grudge; luckily, ATI and AMD haven't let me down yet.

Haha I remember voodoo cards.

I didn't know what happened to them, so I read the wiki article, and it seems pretty tame; I don't know what you're referring to. It implied that the company folded in on itself, couldn't compete, filed for bankruptcy, and was bought out by nvidia. I'd understand if it had been a healthy company that was then bought out and torn down, but it seems they even offered free geforce replacements, which seems almost nice more than anything.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

1gnoirents posted:

I didn't know what happened to them, so I read the wiki article, and it seems pretty tame; I don't know what you're referring to. It implied that the company folded in on itself, couldn't compete, filed for bankruptcy, and was bought out by nvidia. I'd understand if it had been a healthy company that was then bought out and torn down, but it seems they even offered free geforce replacements, which seems almost nice more than anything.

There was a big to-do about it because one of the ways NVidia managed to get ahead was (reportedly, anyhow) basically a massive amount of IP infringement on 3DFX's technologies, like SLI. When they started to go belly-up, NVidia bought them and adroitly avoided any risk of having to deal with a protracted court case.

The reality of the situation may or may not be a bit different, but that's how a lot of people looked at it at the time, and still feel NVidia somehow robbed us all of the glory that would have been further Voodoo cards. With external power bricks. Yeah.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

DrDork posted:

There was a big to-do about it because one of the ways NVidia managed to get ahead was (reportedly, anyhow) basically a massive amount of IP infringement on 3DFX's technologies, like SLI. When they started to go belly-up, NVidia bought them and adroitly avoided any risk of having to deal with a protracted court case.
3dfx died in 2000, long before NV had any sort of SLI support (introduced in NV4x in 2004, I think). 3dfx died because they were incredibly late to hardware T&L, thought Glide would become a permanent monopoly and didn't anticipate D3D/GL correctly, but most importantly because of the disastrous acquisition of STB and the attempt to become a vertically-integrated GPU supplier.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Professor Science posted:

3dfx died in 2000, long before NV had any sort of SLI support (introduced in NV4x in 2004, I think). 3dfx died because they were incredibly late to hardware T&L, thought Glide would become a permanent monopoly and didn't anticipate D3D/GL correctly, but most importantly because of the disastrous acquisition of STB and the attempt to become a vertically-integrated GPU supplier.
You're probably correct--all you have to do is look at the late model/not really released voodoo 5 series to see that they dun hosed up. At the time, though, there was a lot of fanboi angst and anger and general feelings that NVidia had been underhanded about the whole thing. In the end it's kinda irrelevant how exactly it happened; 3DFX wasn't going to survive anyhow.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
Had a VooDoo2 card whatup crew

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Aw yeah, VGA passthrough to Voodoo2 passthrough to MPEG decoder. All the color-keying VGA overlay. What's that, how do I take a screenshot? I don't.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
Had a Voodoo Banshee. It was sweet until it died. The SiS piece of poo poo replacement couldn't even run its own manufacturer-provided benchmark without lagging out, but I somehow put up with it until a friend eventually gave me a Voodoo3 out of pity.

I've never really used Nvidia cards either, though it has nothing to do with 3DFX :rip:. Mostly it's that the first cards I happened to personally buy were from the 9000-series, and then ATI replaced my in-house X850pro a week out of warranty when they really shouldn't have.
It ... kinda had UV paint all over it. Bought themselves a long-term customer, though.

future ghost fucked around with this message at 06:20 on Sep 7, 2014

BurritoJustice
Oct 9, 2012

Is something wrong if my brand new 780ti is seeing drops into the teens while playing borderlands 2? It is only averaging around the 50s too. Brand new MSI 780ti, stock everything, i5 3570k. I'm at 1440p.

Edit: I ran Unigine Heaven and got 1521; searching around, this is about 250 less than other people with 780 Tis are getting.

BurritoJustice fucked around with this message at 08:43 on Sep 7, 2014

DaNzA
Sep 11, 2001

:D
Grimey Drawer

DrDork posted:

You're probably correct--all you have to do is look at the late model/not really released voodoo 5 series to see that they dun hosed up. At the time, though, there was a lot of fanboi angst and anger and general feelings that NVidia had been underhanded about the whole thing. In the end it's kinda irrelevant how exactly it happened; 3DFX wasn't going to survive anyhow.

Wait, nvidia was the shady one? I thought 3dfx was the one trying to squeeze everything out of everyone with their proprietary Glide API and shutting down people's Glide emulation attempts with lawsuits. They even cut off the board manufacturers when they bought STB, but it turned out "made in Mexico" wasn't that good.

But either way the main downfall was that they had lower fps in quake 3 :v:

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

BurritoJustice posted:

Is something wrong if my brand new 780ti is seeing drops into the teens while playing borderlands 2? It is only averaging around the 50s too. Brand new MSI 780ti, stock everything, i5 3570k. I'm at 1440p.

Edit: I ran Unigine Heaven and got 1521; searching around, this is about 250 less than other people with 780 Tis are getting.

I was playing borderlands 2 just fine on my 680; drops into the teens does seem weird. Have you tried the usual troubleshooting stuff? Latest drivers, backing off on any overclocks, etc.?

BurritoJustice
Oct 9, 2012

Latest drivers, stock clocks.

I've used DDU to remove the drivers and reinstalled them with a clean install. Still disappointingly low performance in the games I've tried.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

BurritoJustice posted:

Latest drivers, stock clocks.

I've used DDU to remove the drivers and reinstalled them with a clean install. Still disappointingly low performance in the games I've tried.

what settings are you using in Unigine Heaven? I'll replicate the same settings and see what result I get. Also, what Nvidia Control Panel settings?

BurritoJustice
Oct 9, 2012

The Lord Bude posted:

what settings are you using in Unigine Heaven? I'll replicate the same settings and see what result I get. Also, what Nvidia Control Panel settings?

The 1521 was done with 8x AA, 1920x1080, tessellation extreme. I haven't touched the control panel since clean installing.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

BurritoJustice posted:

The 1521 was done with 8x AA, 1920x1080, tessellation extreme. I haven't touched the control panel since clean installing.

I got 1630.

min fps 26, max 136, avg 64. I was also running the 'ultra' quality level; you didn't specify which one you were using. I reset my control panel settings to default for the test.

I also have a 4770K though, which might have impacted the benchmark.

I've never seen borderlands 2 drop into the teens though, even with my old 680.

BurritoJustice
Oct 9, 2012

The Lord Bude posted:

I got 1630.

min fps 26, max 136, avg 64. I was also running 'ultra' quality level, you didn't specify which one you were using. I reset my control panel settings to default for the test.

I also have a 4770K though, which might have impacted the benchmark.

I've never seen borderlands 2 drop into the teens though, even with my old 680.

Thanks for your help. I went crazy and ordered a TC14PE, to try and see what I can wring out of this CPU.

EDIT: Can the voltage be changed on the stock BIOS? Afterburner allows an offset up to +75mV, but the core voltage stays at 1.175 V max no matter what.

BurritoJustice fucked around with this message at 11:16 on Sep 7, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

BurritoJustice posted:

Thanks for your help. I went crazy and ordered a TC14PE, to try and see what I can wring out of this CPU.

Would your performance issue be CPU related though? When I was playing borderlands on my gtx680 the cpu was an old Phenom IIx6 1090t.

BurritoJustice
Oct 9, 2012

The Lord Bude posted:

Would your performance issue be CPU related though? When I was playing borderlands on my gtx680 the cpu was an old Phenom IIx6 1090t.

I'm at a loss as to what else it could be. 1TB SSD, 16GB RAM, 780ti, month-old install of Windows 8.1 Pro 64-bit. Probably doesn't help that my stock cooler is mounted badly and the CPU has been hitting the 80s while gaming. I was previously playing borderlands fine on dual GTX570s though, so I have no idea. It's weird.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

BurritoJustice posted:

I'm at a loss as to what else it could be. 1TB SSD, 16GB RAM, 780ti, month-old install of Windows 8.1 Pro 64-bit. Probably doesn't help that my stock cooler is mounted badly and the CPU has been hitting the 80s while gaming. I was previously playing borderlands fine on dual GTX570s though, so I have no idea. It's weird.

Something to do with PhysX maybe? I've got nothing.

icantfindaname
Jul 1, 2008


BurritoJustice posted:

I'm at a loss as to what else it could be. 1TB SSD, 16GB RAM, 780ti, month-old install of Windows 8.1 Pro 64-bit. Probably doesn't help that my stock cooler is mounted badly and the CPU has been hitting the 80s while gaming. I was previously playing borderlands fine on dual GTX570s though, so I have no idea. It's weird.

If the CPU is getting hot enough to be throttling you should remount the cooler. It could get damaged otherwise. Throttling would probably account for what's happening, though you could check to be sure
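
The throttling check suggested above boils down to comparing observed clocks against the base clock. Here's a hedged sketch; the `is_throttling` helper and the 95% threshold are made up for illustration, and in practice you'd read the actual clocks from a monitoring tool like HWiNFO or CPU-Z:

```python
def is_throttling(current_mhz, base_mhz, tolerance=0.95):
    """Crude heuristic: sustained clocks well below the base frequency
    while under full load usually indicate thermal throttling."""
    return current_mhz < base_mhz * tolerance

# An i5-3570K has a 3.4 GHz base clock. If monitoring shows it stuck
# around 2.8 GHz under load, throttling is the likely culprit:
print(is_throttling(2800, 3400))  # True
```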

BurritoJustice
Oct 9, 2012

icantfindaname posted:

If the CPU is getting hot enough to be throttling you should remount the cooler. It could get damaged otherwise. Throttling would probably account for what's happening, though you could check to be sure

I ordered a new cooler today that I'm going to use to eliminate the temperature issues on my CPU in the short term, and to overclock my CPU to see if there was a bottleneck in the slightly longer term.

Panty Saluter
Jan 17, 2004

Making learning fun!

Factory Factory posted:

Aw yeah, VGA passthrough to Voodoo2 passthrough to MPEG decoder. All the color-keying VGA overlay. What's that, how do I take a screenshot? I don't.

pro as h*ck

I had a Voodoo 2 for like a week or something (IIRC it was like $80 at Electronics Boutique when I bought it). I was mightily impressed with its ability to run Quake II at 800 x 600 with cool lighting effects (I only tried 800 x 600 on a lark, since even 640 x 480 was almost a magical pipe dream without a 3D accelerator at the time). Then I decided that if one was good, two was better, and resolved to spend my next paycheck on a second one even though I probably didn't really need it. When I went back I found out that the Voodoo 3 was only $20 more, and it was basically an SLI'd Voodoo 2 with better 2D acceleration than I had with whatever 2D card I was using at the time (some 4MB ATi? maybe? or was it only 2MB?). So I bought that, the week-old Voodoo 2 collected dust, and I don't even remember what became of it. This was at a time when I wasn't pulling in more than $150 a week, too.


Thanks for reading this episode of "why someone else should be handling my financial affairs".

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe
Better than my story. All I could afford from loving Best Buy was a Creative 3D Blaster Exxtreme, which was a lovely 3DLabs Permedia 2 card with 4MB of memory. It couldn't even run Quake, and if I tried to run Thief, walls were only textured with the light map, so spotting the carpet in the tutorial was impossible.

I even returned and later re-bought it. Meh.

Tried to buy an add-in card in its place once; the machine failed to boot with a non-dedicated card, because it was a lovely Packard Bell desktop with integrated graphics, which the BIOS disabled on boot if a video card was plugged in, even if it wasn't a complete video card.

Irresponsibility even led me to attempt flashing the card with incompatible firmware. The only fix was to hot-plug the card after POST but before Windows 98 loaded, then reflash it from there with the correct firmware. First time I ever experienced multi-monitor, too.

Torpor
Oct 20, 2008

.. and now for my next trick, I'll pretend to be a political commentator...

HONK HONK
Does mixing a geforce and radeon card cause a net negative if you are running several monitors? Do the drivers conflict and cause a mess? I have two geforce cards at the moment and was thinking of updating one of the cards. I am mostly worried about driver or communication problems between the cards rather than features like SLI or what have you.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
On Windows 7 or later (maybe even Vista), it's a total non-issue. You can even move rendering windows between screens, and the original GPU set to render that window will continue to do so.

Factory Factory fucked around with this message at 16:33 on Sep 7, 2014

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

The Lord Bude posted:

Something to do with PhysX maybe? I've got nothing.

I was going to say that, turn off PhysX in the game settings completely to troubleshoot.

Personally I think Borderlands' cel-shading looks awful, and there's an .ini flag to turn it off completely; doing so happens to give the entire game about a 10% FPS boost, because it's some pretty involved post-processing.

Panty Saluter
Jan 17, 2004

Making learning fun!

kode54 posted:

Better than my story. All I could afford from loving Best Buy was a Creative 3D Blaster Exxtreme, which was a lovely 3DLabs Permedia 2 card with 4MB of memory. It couldn't even run Quake, and if I tried to run Thief, walls were only textured with the light map, so spotting the carpet in the tutorial was impossible.

I even returned and later re-bought it. Meh.

Tried to buy an add-in card in its place once; the machine failed to boot with a non-dedicated card, because it was a lovely Packard Bell desktop with integrated graphics, which the BIOS disabled on boot if a video card was plugged in, even if it wasn't a complete video card.

Irresponsibility even led me to attempt flashing the card with incompatible firmware. The only fix was to hot-plug the card after POST but before Windows 98 loaded, then reflash it from there with the correct firmware. First time I ever experienced multi-monitor, too.

Oh hey, I didn't know you posted here. I enjoy many of your Foobar plugins :cheers:

I don't know, my story didn't involve nearly as much hacking and yours has that plus utterly wasted money. It's always nice to know that I'm not the only person with utterly poo poo impulse control.

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe

Panty Saluter posted:

Oh hey, I didn't know you posted here. I enjoy many of your Foobar plugins :cheers:

I don't know, my story didn't involve nearly as much hacking and yours has that plus utterly wasted money. It's always nice to know that I'm not the only person with utterly poo poo impulse control.

Thanks, glad to know I brightened your computer using experience.

As for waste of money, the computer was purchased as a family machine by parents, years ago. My first Windows PC. The Permedia 2 wasn't a total waste, as it was a step up from the Cirrus Logic dumb frame buffer chip it came with. Not much of one, but better than nothing.

That computer is gone now, but I still have the video card littering a spot in my headboard cabinet.

Geemer
Nov 4, 2010



Are we talking about lovely old GPUs we had?

I arrived way late to the PC gaming scene, having grown up with a SNES and N64. The first computer I had with an actual video card worth mentioning was a Windows 2000 machine with a TNT2 that my dad got through his work.
Apparently there was an issue with the TNT2, Windows 2000, and OpenGL that Nvidia refused to fix. If you actually dared to use OpenGL on that card and OS combo, it'd work for five minutes and then your system would freeze hard.
And then you couldn't reboot, because it wouldn't POST until the charge on the card had dissipated and it had cooled down completely. I still don't really know what the hell was up with that, but they were some good bullshit times.

It wasn't until years later when I bought an HD4850 that I even bothered trying PC gaming again.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
As a kid my first discrete GPU was a Matrox Mystique. I couldn't get Descent: Freespace to run on it, though, so we ended up moving into the 21st century with a shiny Rage 128 Pro :hellyeah:

Always wanted a Voodoo but my dad would never go for it. I think the need for a second GPU to handle 2D rendering probably scared him away.

Paul MaudDib fucked around with this message at 19:34 on Sep 7, 2014
