|
AMD should sue Qualcomm for allowing them to make the worst business decision of the century. Even the judge would give them an "aww, you dumb kids!" kind of look before dismissing it.
|
# ? Sep 6, 2014 07:45 |
|
|
No Gravitas posted:Isn't that connector a standard length? Or the screws? Or the VGA ports are a standard size? You can probably use proportions from a known quantity and calculate out exactly what you want. gently caress, wish I would have thought of that. Thanks!
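A quick sketch of that proportion math in Python — the pixel counts below are made-up numbers, and the VGA shell width is an assumption you'd want to verify against a D-sub connector datasheet:

```python
# Estimate an unknown dimension of a card from a photo by scaling
# against a feature of known size (here: the width of the VGA
# connector shell, assumed ~16.5 mm -- verify against a datasheet).

def scale_from_reference(known_mm: float, known_px: float, unknown_px: float) -> float:
    """Convert a pixel measurement to mm using a known reference feature."""
    mm_per_px = known_mm / known_px
    return mm_per_px * unknown_px

# Hypothetical measurements taken off a product photo:
# VGA shell spans 100 px, full card length spans 1600 px.
card_length_mm = scale_from_reference(16.5, 100, 1600)
print(f"{card_length_mm:.1f} mm")  # 264.0 mm
```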
|
# ? Sep 6, 2014 08:54 |
|
Factory Factory posted:This isn't patent trolling. According to Nvidia, it tried to work out licensing with Samsung for a long time before filing this suit. The GeForce 256 was the first totally integrated GPU. Unified shaders were a major shift in GPU design. Why shouldn't these things be patented? They're not even software techniques, but rather hard-goods inventions. It looks to me like Samsung is just being caught in the crossfire due to being a major purchaser of mobile chips, honestly. Nvidia wants to hit Qualcomm and take a big chunk out of their wallet at the same time. Nvidia wouldn't need to work out a licensing agreement with Samsung as they aren't in the business of producing mobile graphics chips.
|
# ? Sep 6, 2014 09:13 |
|
I'm still mad at NVidia for destroying 3DFX. I was a diehard and I actually ran a Voodoo 5 5500 dual GPU card (which basically competed with the original GeForce, despite coming out when the GeForce 2 debuted). I played Unreal Tournament and Quake 3 just fine, thank you very much. I've been AMD all the way ever since. I have a stupid grudge; luckily, ATI and AMD haven't let me down yet.
|
# ? Sep 6, 2014 10:39 |
|
Mad_Lion posted:I'm still mad at NVidia for destroying 3DFX. this is like me with steam because I bought the half-life retail pack and found out I couldn't come back to it to play online then ten years later I started playing computer games and
|
# ? Sep 6, 2014 11:38 |
|
spasticColon posted:Why page 256? Is that number significant in some way? it's the maximum number of values representable in an 8-bit word vv Sidesaddle Cavalry posted:this is like me with steam because I bought the half-life retail pack and found out I couldn't come back to it to play online I bought the retail pack years before Steam and they still honored my keys. It ruled since I could finally stop carting those CDs around.
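For anyone counting along, the 8-bit claim checks out in a couple of lines of Python:

```python
# An 8-bit word represents 2**8 = 256 distinct values, 0 through 255 --
# hence page 256 being the milestone number.
values = 2 ** 8
print(values)      # 256
print(values - 1)  # 255, the highest value a single byte can hold
assert values == len(range(256))
```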
|
# ? Sep 6, 2014 13:29 |
|
isndl posted:It looks to me like Samsung is just being caught in the crossfire due to being a major purchaser of mobile chips, honestly. Nvidia wants to hit Qualcomm and take a big chunk out of their wallet at the same time. Nvidia wouldn't need to work out a licensing agreement with Samsung as they aren't in the business of producing mobile graphics chips. Not the IP, but they absolutely do produce chips with GPUs - their Exynos SoCs use Mali and PowerVR GPU blocks.
|
# ? Sep 6, 2014 14:20 |
|
Sidesaddle Cavalry posted:this is like me with steam because I bought the half-life retail pack and found out I couldn't come back to it to play online This is no one's fault but your own: you can redeem your Half-Life key with Steam, and Valve will even throw in all the add-ons and some additional multiplayer titles/mods for free. https://support.steampowered.com/kb_article.php?ref=7480-wusf-3601
|
# ? Sep 6, 2014 14:40 |
|
Factory Factory posted:Not the IP, but they absolutely do produce chips with GPUs - their Exynos SoCs use Mali and PowerVR GPU blocks.e Well, I guess that shows how much I pay attention to the whole SoC scene. Samsung does look like a more legitimate target in that case.
|
# ? Sep 6, 2014 15:57 |
|
Regarding the AMD 285 cards - it's a pretty safe bet that a 285X will follow shortly, right? I haven't seen any announcements yet, but it seems almost certain at this point.
|
# ? Sep 6, 2014 17:26 |
|
i am so happy i have a 290x for the freesyncs mmmmmmm delissshhhh i am so content ent ent ent. (USER WAS PUT ON PROBATION FOR THIS POST)
|
# ? Sep 6, 2014 17:34 |
|
Mad_Lion posted:I'm still mad at NVidia for destroying 3DFX. I was a diehard and I actually ran a Voodoo 5 5500 dual GPU card (which basically competed with the original GeForce, despite coming out when the GeForce 2 debuted). Haha, I remember Voodoo cards. I didn't know what happened to them so I read the wiki article, and it seems pretty tame; I don't know what you're referring to. It implied that the company folded in on itself, couldn't compete, filed for bankruptcy, and was bought out by nvidia. I'd understand if it was a healthy company that was then bought out and torn down, but it seems they even offered free GeForce replacements, which seems almost nice more than anything.
|
# ? Sep 7, 2014 04:01 |
|
1gnoirents posted:I didn't know what happened to them so I read the wiki article and it seems pretty tame, I don't know what you're referring to. It implied that the company folded on itself, couldn't compete, filed for bankruptcy, and was bought out by nvidia. I'd understand if it was a healthy company and then bought out and torn down, but it seems they even offered free geforce replacements which seems almost nice more than anything. There was a big to-do about it because one of the ways NVidia managed to get ahead was (reportedly, anyhow) basically a massive amount of IP infringement on 3DFX's technologies, like SLI. When they started to go belly-up, NVidia bought them and adroitly avoided any risk of having to deal with a protracted court case. The reality of the situation may or may not be a bit different, but that's how a lot of people looked at it at the time, and still feel NVidia somehow robbed us all of the glory that would have been further Voodoo cards. With external power bricks. Yeah.
|
# ? Sep 7, 2014 04:31 |
|
DrDork posted:There was a big to-do about it because one of the ways NVidia managed to get ahead was (reportedly, anyhow) basically a massive amount of IP infringement on 3DFX's technologies, like SLI. When they started to go belly-up, NVidia bought them and adroitly avoided any risk of having to deal with a protracted court case.
|
# ? Sep 7, 2014 05:31 |
|
Professor Science posted:3dfx died in 2000, long before NV had any sort of SLI support (introduced in NV4x in 2004, I think). 3dfx died because they were incredibly late to hardware T&L, thought Glide would become a permanent monopoly and didn't anticipate D3D/GL correctly, but most importantly because of the disastrous acquisition of STB and the attempt to become a vertically-integrated GPU supplier.
|
# ? Sep 7, 2014 05:33 |
|
Had a VooDoo2 card whatup crew
|
# ? Sep 7, 2014 06:01 |
|
Aw yeah, VGA passthrough to Voodoo2 passthrough to MPEG decoder. All the color-keying VGA overlay. What's that, how do I take a screenshot? I don't.
|
# ? Sep 7, 2014 06:04 |
|
Had a Voodoo Banshee. It was sweet until it died. The SiS piece of poo poo replacement couldn't even run its own manufacturer-provided benchmark without lagging out, but I somehow put up with it until a friend eventually gave me a Voodoo3 out of pity. I've never really used Nvidia cards either, but that has nothing to do with 3DFX; it's mostly that the first cards I happened to personally buy were from the 9000-series, and then ATI replaced my in-house X850pro a week out of warranty when they really shouldn't have. It ... kinda had UV paint all over it. Bought themselves a long-term customer though. future ghost fucked around with this message at 06:20 on Sep 7, 2014 |
# ? Sep 7, 2014 06:13 |
|
Is something wrong if my brand new 780ti is seeing drops into the teens while playing Borderlands 2? It is only averaging around the 50s too. Brand new MSI 780ti, stock everything, i5 3570k. I'm at 1440p. Edit: I ran Unigine Heaven and got 1521; searching around, this is about 250 less than other people with 780ti's are getting. BurritoJustice fucked around with this message at 08:43 on Sep 7, 2014 |
# ? Sep 7, 2014 08:30 |
|
DrDork posted:You're probably correct--all you have to do is look at the late model/not really released voodoo 5 series to see that they dun hosed up. At the time, though, there was a lot of fanboi angst and anger and general feelings that NVidia had been underhanded about the whole thing. In the end it's kinda irrelevant how exactly it happened; 3DFX wasn't going to survive anyhow. Wait, nvidia was the shady one? I thought only 3dfx was trying to squeeze everything out of everyone with their own proprietary Glide API and shutting down people's Glide emulation attempts with lawsuits. They even cut off the board manufacturers when they bought STB, but it turned out "made in Mexico" wasn't that good. But either way, the main downfall was that they had lower fps in Quake 3
|
# ? Sep 7, 2014 09:15 |
|
BurritoJustice posted:Is something wrong if my brand new 780ti is seeing drops into the teens while playing borderlands 2? It is only averaging around the 50s too. Brand new MSI 780ti, stock everything, i5 3570k. I'm at 1440p. I was playing borderlands 2 just fine on my 680, drops into the teens does seem weird. Have you tried the usual troubleshooting stuff? latest drivers, backing off on any overclocks, etc?
|
# ? Sep 7, 2014 09:28 |
|
Latest drivers, stock clocks. I've used DDU to remove the drivers and reinstalled them with a clean install. Still disappointingly low performance in the games I've tried.
|
# ? Sep 7, 2014 10:02 |
|
BurritoJustice posted:Latest drivers, stock clocks. What settings are you using in Unigine Heaven? I'll replicate the same settings and see what result I get. Also, what Nvidia Control Panel settings?
|
# ? Sep 7, 2014 10:17 |
|
The Lord Bude posted:what settings are you using in Unigine Heaven? I'll replicate the same settings and see what result I get. Also what Nvidia Control panel settings. The 1512 was done with 8x AA, 1920x1080, tessellation extreme. I haven't touched the control panel since clean installing.
|
# ? Sep 7, 2014 10:41 |
|
BurritoJustice posted:The 1512 was done with 8x AA, 1920x1080, tessellation extreme. I haven't touched the control panel since clean installing. I got 1630. min fps 26, max 136, avg 64. I was also running 'ultra' quality level, you didn't specify which one you were using. I reset my control panel settings to default for the test. I also have a 4770K though, which might have impacted the benchmark. I've never seen borderlands 2 drop into the teens though, even with my old 680.
|
# ? Sep 7, 2014 11:01 |
|
The Lord Bude posted:I got 1630. Thanks for your help. I went crazy and ordered a TC14PE, to try and see what I can wring out of this CPU. EDIT: Can the voltage be changed on the stock BIOS? Afterburner allows an offset up to +75mV but the core voltage stays at 1.175 max no matter what. BurritoJustice fucked around with this message at 11:16 on Sep 7, 2014 |
# ? Sep 7, 2014 11:06 |
|
BurritoJustice posted:Thanks for your help. I went crazy and ordered a TC14PE, to try and see what I can wring out of this CPU. Would your performance issue be CPU related though? When I was playing borderlands on my gtx680 the cpu was an old Phenom IIx6 1090t.
|
# ? Sep 7, 2014 11:15 |
|
The Lord Bude posted:Would your performance issue be CPU related though? When I was playing borderlands on my gtx680 the cpu was an old Phenom IIx6 1090t. I'm at a loss as to what else it could be. 1TB SSD, 16GB RAM, 780ti, month-old install of Windows 8.1 Pro 64-bit. Probably doesn't help that my stock cooler is mounted badly and the CPU has been hitting the 80s while gaming. I was previously playing Borderlands fine on dual GTX 570s though, so I have no idea. It's weird.
|
# ? Sep 7, 2014 11:45 |
|
BurritoJustice posted:I'm at a loss as to what else it could be. 1TB SSD, 16GB RAM, 780ti, month old install of Windows 8.1 Pro 64bit. Probably doesn't help that my stock cooler is mounted badly and the CPU has been hitting the 80's while gaming. I previously playing borderlands fine on dual GTX570s though so I have no idea. It's weird. Something to do with PhysX maybe? I've got nothing.
|
# ? Sep 7, 2014 12:02 |
|
BurritoJustice posted:I'm at a loss as to what else it could be. 1TB SSD, 16GB RAM, 780ti, month old install of Windows 8.1 Pro 64bit. Probably doesn't help that my stock cooler is mounted badly and the CPU has been hitting the 80's while gaming. I previously playing borderlands fine on dual GTX570s though so I have no idea. It's weird. If the CPU is getting hot enough to be throttling you should remount the cooler. It could get damaged otherwise. Throttling would probably account for what's happening, though you could check to be sure
|
# ? Sep 7, 2014 12:24 |
|
icantfindaname posted:If the CPU is getting hot enough to be throttling you should remount the cooler. It could get damaged otherwise. Throttling would probably account for what's happening, though you could check to be sure I ordered a new cooler today that I'm going to use to eliminate the temperature issues on my CPU in the short term, and to overclock my CPU to see if there was a bottleneck in the slightly longer term.
|
# ? Sep 7, 2014 12:55 |
|
Factory Factory posted:Aw yeah, VGA passthrough to Voodoo2 passthrough to MPEG decoder. All the color-keying VGA overlay. What's that, how do I take a screenshot? I don't. pro as h*ck I had a Voodoo 2 for like a week or something (IIRC it was like $80 at Electronics Boutique when I bought it). I was mightily impressed with its ability to run Quake II at 800x600 with cool lighting effects (I only tried 800x600 on a lark, since even 640x480 was almost a magical pipe dream without a 3D accelerator at the time). Then I decided that if one was good, two was better, and I decided to spend my next paycheck on a second one even though I probably didn't really need it. When I went back I found out that the Voodoo 3 was only $20 more, and it was basically an SLI'd Voodoo 2 with better 2D acceleration than I had with whatever 2D card I was using at the time (some 4MB ATi? maybe? or was it only 2MB?). So I bought that, and the week-old Voodoo 2 collected dust; I don't even remember what became of it. This was at a time when I wasn't pulling in more than $150 a week, too. Thanks for reading this episode of "why someone else should be handling my financial affairs".
|
# ? Sep 7, 2014 13:26 |
|
Better than my story. All I could afford from loving Best Buy was a Creative 3D Blaster Exxtreme, which was a lovely 3DLabs Permedia 2 card with 4MB of memory. It couldn't even run Quake, and if I tried to run Thief, walls were only textured with the light map, so spotting the carpet in the tutorial was impossible. I even returned and later re-bought it. Meh. Tried to buy an add-in card in its place once; it failed to boot with a non-dedicated card, because it was a lovely Packard Bell desktop with integrated graphics which the BIOS disabled on boot if a video card was plugged in, even if it wasn't a complete video card. Irresponsibility even led me to attempt flashing the card with incompatible firmware. The only fix was to hot-plug the card post-boot, but before Windows 98 loaded, then reflash it from there with the correct firmware. First time I ever experienced multi-monitor, too.
|
# ? Sep 7, 2014 14:44 |
|
Does mixing a geforce and radeon card cause a net negative if you are running several monitors? Do the drivers conflict and cause a mess? I have two geforce cards at the moment and was thinking of updating one of the cards. I am mostly worried about driver or communication problems between the cards rather than features like SLI or what have you.
|
# ? Sep 7, 2014 16:25 |
|
On Windows 7 or later (maybe even Vista), it's a total non-issue. You can even move rendering windows between screens, and the original GPU set to render that window will continue to do so.
Factory Factory fucked around with this message at 16:33 on Sep 7, 2014 |
# ? Sep 7, 2014 16:30 |
|
The Lord Bude posted:Something to do with PhysX maybe? I've got nothing. I was going to say that: turn off PhysX in the game settings completely to troubleshoot. Personally I think Borderlands' cel-shading looks awful, and there's an .ini flag to completely turn it off; doing so happens to give the entire game like a 10% FPS boost, because it's some pretty involved post-processing.
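For reference, the cel-shading tweak people passed around was an edit to WillowEngine.ini (under Documents\My Games\Borderlands 2\WillowGame\Config). The exact key and value below are from memory and may vary by game version — back up the file and check a current guide before applying:

```ini
; WillowEngine.ini -- replace the Willow (cel-shaded, black-outline)
; post-process chain with the stock Unreal Engine one.
[Engine.Engine]
; original value (from memory): WillowEngineMaterials.WillowScenePostProcess
DefaultPostProcessName=EngineMaterials.DefaultScenePostProcess
```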
|
# ? Sep 7, 2014 16:34 |
|
kode54 posted:Better than my story. All I could afford from loving Best Buy was a Creative 3D Blaster Exxtreme, which was a lovely 3DLabs Permedia 2 card with 4MB of memory. It couldn't even run Quake, and if I tried to run Thief, walls were only textured with the light map, so spotting the carpet in the tutorial was impossible. Oh hey, I didn't know you posted here. I enjoy many of your Foobar plugins I don't know, my story didn't involve nearly as much hacking and yours has that plus utterly wasted money. It's always nice to know that I'm not the only person with utterly poo poo impulse control.
|
# ? Sep 7, 2014 16:43 |
|
Panty Saluter posted:Oh hey, I didn't know you posted here. I enjoy many of your Foobar plugins Thanks, glad to know I brightened your computer using experience. As for waste of money, the computer was purchased as a family machine by parents, years ago. My first Windows PC. The Permedia 2 wasn't a total waste, as it was a step up from the Cirrus Logic dumb frame buffer chip it came with. Not much of one, but better than nothing. That computer is gone now, but I still have the video card littering a spot in my headboard cabinet.
|
# ? Sep 7, 2014 16:55 |
|
Are we talking about lovely old GPUs we had? I arrived way late to the PC gaming scene, having grown up with a SNES and N64. The first computer I had with an actual video card worth mentioning was a TNT2 in a Windows 2000 machine that my dad got through his work. Apparently there was an issue with the TNT2, Windows 2000, and OpenGL that Nvidia refused to fix. If you actually dared using OpenGL on that card and OS combo, it'd work for five minutes and then your system would freeze hard. And then you couldn't reboot, because it wouldn't POST until the charge on the card had dissipated and it had cooled down completely. I still don't really know what the hell was up with that. It wasn't until years later, when I bought an HD4850, that I even bothered trying PC gaming again.
|
# ? Sep 7, 2014 19:13 |
|
|
As a kid my first discrete GPU was a Matrox Mystique. I couldn't get Descent: FreeSpace to run on it, though, so we ended up moving into the 21st century with a shiny Rage 128 Pro. Always wanted a Voodoo, but my dad would never go for it. I think the need for a second GPU to handle 2D rendering probably scared him away. Paul MaudDib fucked around with this message at 19:34 on Sep 7, 2014 |
# ? Sep 7, 2014 19:18 |