|
Squibbles posted:I had a Matrox Mystique. It was interesting because it was nearly as fast as a 3dfx card but had basically zero features. No coloured lighting, no transparency, etc. In Jedi Knight multiplayer if someone used force blind on you it just turned every other pixel on your display white instead of actually blinding you and slowly letting your vision fade back in.

The Mystique was notable because it was superficially really fast, but - as you say - lacked a number of features which rapidly became very important. Among them:

* antialiasing
* mipmapping
* bilinear filtering
* fogging
* hardware transparency

Given that dearth of features - the chip was probably missing a number of blending modes too, my memory's just too rusty - the Mystique worked about as well as it could have. For 2D it was enviable at the time; I kept one with a 3Dfx Voodoo1 in one computer or another for years. The Virge had a decent feature set but was designed to manage software-quality rendering at lower resolutions, not to be brutally proficient. Layer on bilinear filtering, on-chip z-buffering, perspective-correct texture mapping, and a full suite of effects, and it tanked. Later Virges were twice as fast as the originals, but by then S3 was making the most of a bad situation and trying to shove the Savage cards out the door. Which was its own saga...
|
# ¿ Jun 17, 2020 06:43 |
|
|
# ¿ Apr 27, 2024 03:40 |
|
Hey goons, I snagged a Radeon Vega Frontier Edition for scientific computing with some light gaming on the side. The default clocks and thermals are pretty... optimistic, given its cooling, and I've had some luck with undervolting and selective downclocking while playing with the power limit. But here's my question: does anyone here have the stock Vega 64 pstate voltages and clockspeed values? I've had a bear of a time finding them online, and would love to have access to that data as a saner baseline than what the card tries to push by default.
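For anyone else poking at this on Linux, the amdgpu driver exposes the pstate table through the `pp_od_clk_voltage` sysfs node, so once you have a card in hand you can dump its table yourself. Here's a minimal parser sketch for that table format - note the numbers in the sample are illustrative placeholders only, NOT the stock Vega 64 values I'm asking about:

```python
import re

# Sample text in the general shape of /sys/class/drm/card0/device/pp_od_clk_voltage.
# NOTE: these clock/voltage numbers are made-up placeholders for illustration,
# not real stock Vega 64 pstate values.
SAMPLE = """\
OD_SCLK:
0:        852MHz        800mV
1:        991MHz        900mV
2:       1138MHz       1000mV
OD_MCLK:
0:        167MHz        800mV
1:        945MHz        950mV
"""

def parse_pstates(text):
    """Parse a pp_od_clk_voltage-style dump into {table_name: [(MHz, mV), ...]}."""
    tables, current = {}, None
    for line in text.splitlines():
        header = re.match(r"(OD_\w+):", line)
        if header:
            current = tables.setdefault(header.group(1), [])
            continue
        row = re.match(r"\s*\d+:\s+(\d+)MHz\s+(\d+)mV", line)
        if row and current is not None:
            current.append((int(row.group(1)), int(row.group(2))))
    return tables

states = parse_pstates(SAMPLE)
print(states["OD_SCLK"])
```

On a real system you'd read the sysfs file instead of the embedded sample; the exact column formatting can vary by kernel version, so treat the regex as a starting point.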
|
# ¿ Jul 31, 2020 18:10 |
|
Out of curiosity, in terms of performance and especially compute, how much of an upgrade would a 2070 (Super) be over a GTX Titan X (Maxwell)? I'd imagine the RTX 3*** cards will have people dumping their old ones and I'm not above scavenging.
|
# ¿ Aug 9, 2020 21:20 |
|
* Onboard S3 Trio32/64
* Matrox Mystique
* 3Dfx Voodoo Graphics
* 3Dfx Voodoo3
* PowerVR Kyro II
* GeForce3
* GeForce FX 5900XT
* GeForce 7800GS
* GeForce 9600GT
* GeForce GTX 550 Ti
* GeForce GTX 660
* Radeon RX 480
* GeForce GTX 1070 Ti
* Radeon Vega Frontier Edition

This does not include all the other systems I put together because I was bored along the way... if you can name a graphics card make and model from the last twenty-five years, I've probably at least played with it.
|
# ¿ Aug 12, 2020 14:24 |
|
Yeah, this screams "we failed our console cert" and now they have to go through the boondoggle again. Godspeed to them - if they miss Christmas, there'll be hell to pay. If there's one thing I'm not interested in doing, it's hitching my wagon to Nvidia's star on the back of their proprietary tech. My needs are basically 1440p75, with the expectation the card will last half a decade and possibly be amenable to some scientific computing and machine learning. If the RX 6800's a solid 3080 competitor with more RAM, I won't hesitate to grab one.
|
# ¿ Oct 27, 2020 20:21 |
|
The OpenCL driver will probably work in Windows, so it'll run BOINC. And at some point ROCm support will (hopefully? prayerfully?) show up in Linux, at which point I'll dual boot it and use it in conjunction with my GTX Titan X. Thanks for the condescension, all the same.
|
# ¿ Oct 27, 2020 20:25 |
|
jkyuusai posted:Yeah wait, is this like.... they put a conditional in the TV's firmware that says "process the signal differently if the input I'm running on is named PC and has this specific icon"????? It’s a weird shorthand way of indicating, “this input will not benefit from being run through whatever DSP is in place to perform MPEG noise removal/frame interpolation/other image enhancing features, so just pass it through as fast as possible to mitigate latency.” At least, I assume that hasn’t changed much from my circa 2009 HDTV...
|
# ¿ Oct 27, 2020 21:54 |
|
God, I just want an RX 6800 on launch day without having to enter a pact with a lower power. Any pointers?
|
# ¿ Nov 9, 2020 17:15 |
|
Some Goon posted:Have lived a pure and kindly life so that a higher power might help you? Pure, no. Kindly, yes. Mostly I just wish I knew what *time* to start hammering places. Sounds like Best Buy has been more reliable than NewEgg for RTX sourcing, so maybe I'll get lucky there.
|
# ¿ Nov 9, 2020 17:37 |
|
repiv posted:females on blower are back, in anime titty form See, this just makes me want Yeston to make a Cute Pet 2 RX 6800.
|
# ¿ Nov 9, 2020 18:48 |
|
th3t00t posted:So I managed to snag a 3080 from Newegg right after my zotac 3070 shipped and could no longer cancel it. I'm tempted to buy it from you, impending RX 6800 release be damned... If you still have it, let me know.
|
# ¿ Nov 9, 2020 20:17 |
|
Zeta Acosta posted:whats most likely: radeon 600, rx 5800 or rx 6700? RX 6700, either of the others would confuse the hell out of people. Not that the others are impossible, but unified branding is something AMD will want to project if they have any control over it. Hell, I wish a 12GB RX 6700 were available right now.
|
# ¿ Nov 10, 2020 14:34 |
|
Handsome Ralph posted:In the off chance I can help a Goon who wants a 3070 and has had poo poo luck, before I cancel it, does anyone want this? Me please, depending on price
|
# ¿ Nov 10, 2020 17:03 |
|
Just wanted to add that Handsome Ralph's coming through for me with an RTX 3070. God bless you, goon.
|
# ¿ Nov 10, 2020 18:39 |
|
sauer kraut posted:There is no reference 590, what you probably meant is that whoever built your card used an inadequate entry level cooling solution for a 225W chip plus memory etc on top that is pushed way beyond where it should be (ie the 480). Very much so. The 580 itself amounted to a factory overclocked and overvolted RX 480 with minor bug fixes to power delivery, where a 6% performance improvement was accomplished by ramping up TDP from 150 to 185W. The 590 doubled down on this with a die shrink from 14nm to 12nm, and a performance jump of around 12% facilitated by TDP growing again to 225W. So for less than a 20% gain in performance from an 8GB RX 480, the power budget mushroomed 50%. And it sounds like whoever built OP's card scrimped to save a few pennies. Underclocking 5 to 10% could put the core closer to its sweet spot and allow for a disproportionately nice undervolt, and a custom fan curve could possibly quiet some of that fan noise. Or you can just accept the 590 at face value and wear headphones. That's legit too.
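The gen-over-gen math above, spelled out using the TDP and performance figures quoted in the post:

```python
def pct(new, old):
    """Percent change going from old to new."""
    return (new - old) / old * 100

# TDP figures (watts) and relative performance as quoted above:
# 580 is ~+6% over the 480, 590 is ~+12% over the 580.
tdp = {"RX 480": 150, "RX 580": 185, "RX 590": 225}
perf = {"RX 480": 1.00, "RX 580": 1.06, "RX 590": 1.06 * 1.12}

print(f"480 -> 580 TDP:  +{pct(tdp['RX 580'], tdp['RX 480']):.0f}%")   # ~+23%
print(f"480 -> 590 TDP:  +{pct(tdp['RX 590'], tdp['RX 480']):.0f}%")   # +50%
print(f"480 -> 590 perf: +{pct(perf['RX 590'], perf['RX 480']):.0f}%") # ~+19%
```

Which is exactly the "less than 20% performance for 50% more power" tradeoff described.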
|
# ¿ Nov 10, 2020 23:52 |
|
Inept posted:Mocking weird anime boob cards isn't performative but simply recognizing a fact that they're hilarious and gross. See also: video card with lips. It's descended from the same ridicule we used to give to cards with artwork of cyborg frogs and big robots and fairies on them. The difference now is that this artwork is more akin to what you'd expect from someone hawking their OnlyFans, except it's on a GPGPU. It's objectifying and weird and a bridge too far for a lot of people. They should never have messed with perfection.
|
# ¿ Nov 11, 2020 01:14 |
|
Theophany posted:seta com_maxfps 43/76/125 for the Quake 3 engine. Allowed you to pull off trickjumps and I think strafejumping? God, I'm having flashbacks. A lot of guys who wanted to Git Gud but didn't have much money opted for Nvidia TNT2s with all options turned down far past minimum settings (r_picmip 32 in one guy's config made me laugh out loud) because they had solid triangle throughput. At that point you were basically just pushing triangles as fast as you could go with trivial texturing requirements and some simple lovely sprite items for visibility. They swore by it, but it struck me as too damned ugly to be worthwhile.
|
# ¿ Nov 16, 2020 19:09 |
|
Handsome Ralph posted:Asked this earlier and didn't see a response so I figured I'd ask one more time. It sounds to me like a legacy BIOS compatibility problem on systems with CSM enabled. The CSM bitches that it can't find a compatible BIOS for the card, then the UEFI takes over after the CSM issues the four beeps of consternation. Weird, either way.
|
# ¿ Nov 16, 2020 21:56 |
|
Handsome Ralph posted:Huh, that is weird. Mystery solved, I guess. Thanks! Sure! Disabling CSM will probably fix the issue outright.
|
# ¿ Nov 16, 2020 22:23 |
|
Well, I've gotta say, the Gigabyte Eagle RTX 3070 is deliriously good. After using a procession of AMD cards I'm shocked by how LITTLE this thing is - long but narrow and surprisingly light. It's also quiet and profanely fast, and feels like heaven after dealing with a finicky Vega Frontier Edition for months. As it's connected to a vintage Dell 1200p monitor and a 1440p75 monitor, I predict this will last me for the next five years, easy. Thanks again to Handsome Ralph for making this possible!
|
# ¿ Nov 17, 2020 15:36 |
|
Having just snagged an RTX 3070, I wish I had enough time and money to grab an RX 6800 too. Of course that'd be contingent on actually being able to source one... but for now it'll be RTX 3070 on my Windows machine, Vega Frontier Edition in my Linux box. C'mon, ROCm, get RDNA(2) working already.
|
# ¿ Nov 18, 2020 16:01 |
|
gradenko_2000 posted:so TPU's review of the RX 6800 [Flat] has it consistently ahead of the RTX 3070 ... buuuuuuut if the 6800 is also more expensive than the RTX 3070 then the value proposition is still toast. For the extra cash you get higher rasterization performance and 8GB extra RAM, but give up DLSS in the short to medium term and have provably lower raytracing performance. It's a tossup, but if I hadn't suddenly had the chance to snag an RTX 3070 I'd leap for a vanilla 6800 myself.
|
# ¿ Nov 18, 2020 17:05 |
|
bus hustler posted:OK the Accelero IV is ugly as poo poo and I am going to 100% need to redo this installation but it's SO MUCH QUIETER than my stock blower on the 2080ti. I'm glad that worked out for you. I have a Core i9-7940x with a Vega Frontier Edition, and am finally just sick to death of the card's noisy stock cooler. Daring to load the GPU makes it sound like an angry box fan - even undervolting and underclocking only do so much to alleviate the droning. With any luck the Raijintek Morpheus II Vega cooler I snagged a few days ago will tame it. Wish me luck.
|
# ¿ Nov 19, 2020 18:09 |
|
hobbesmaster posted:Why would you want a 980ti, retailers still want over $500 for them for some dumb reason. Look: Secondhand I see them go for closer to $160 to $180, which is closer to rational. I sold a Titan X last month for $250, which is basically a perfectly rational asking price for what amounts to a 980 Ti with a framebuffer big enough for machine learning applications and 1080p60 with texture storage to spare. eBay skews high for this stuff. Those retailers selling new parts are either hoping for collectors or stubbornly refusing to update prices, in hopes somebody desperate or clueless enough will come along and pay.
|
# ¿ Nov 21, 2020 23:45 |
|
I know it's not a new piece of hardware, but after enduring my Vega Frontier Edition's godawful noise and thermals I plonked on a Raijintek Morpheus II and two 12.5mm-high 120mm Noctua fans. It is a MASSIVE visual downgrade, but I went from routinely skirting overheating with a noise profile like a box fan in a dirty factory to gently audible with temps that never nudge above 80 Celsius in exhaustive torture tests. Recommended - though maybe I should have gone with be quiet! fans for aesthetic reasons. |
# ¿ Nov 24, 2020 20:18 |
|
Internet Explorer posted:It's beautiful, and I will not have you talk poorly about such a precious baby. Well, aren't you sweet. Here, have a topless shot.
|
# ¿ Nov 24, 2020 20:25 |
|
Sininu posted:You made a very good decision Very nice! The walkthrough I followed indicated that heatsinks on the chokes weren't crucial versus heatsinks on the VRMs - did I err by not sticking some on? It's been solid and stable so I haven't been that worried, but I wanna be sure I did this right.
|
# ¿ Nov 24, 2020 20:39 |
|
Sininu posted:It's cute I'm a geologist, not an electrical engineer, but spitballing it: above thermal specifications, you'd start getting errors and instability, much like what overaggressive RAM overclocks produce. Get it legitimately too hot in a sustained way and you could actually damage the chips, or maybe even fry the IMC. Alternatively, based on the recent Gamers Nexus video, wait six months to a year and see what happens to PlayStation 5s!
|
# ¿ Nov 24, 2020 21:13 |
|
kloa posted:slim noctua buds They are really good. I've got a pair of super quiet grey ones in a different PC, but they'd have added to a height requirement that already felt ostentatious. You have a great-looking setup!
|
# ¿ Nov 24, 2020 21:38 |
|
sean10mm posted:I mean, the 6800XT also eats poo poo in RT performance vs. the 3080 in Control, 3dMark Port Royal, Metro Exodus and Battlefield 5. Are they all nvidia shills? If - and it's a big if - AMD creates an open DLSS alternative with similar results - can we assume their raytracing performance amounts to 50-60% of Nvidia's current generation? Or is that too optimistic?
|
# ¿ Nov 24, 2020 23:19 |
|
Kunabomber posted:All pretty good points - I guess I'm tired of F5ing and finding justifications. This isn't quite like that... The Kyro (2) was a TNT2-class graphics card with DXT1 compression support and a hell of a knack for overdraw mitigation, at a point when texture compression, hardware T&L, cubemapping, and stencil buffer shadows were available from competitors and the first primitive pixel and vertex shaders were coming around. The new Radeons functionally do most of what the Geforces do, outside of lacking dedicated tensor units (which we've seen hobbles raytracing performance) and machine learning supersampling, which is allegedly coming down the pike. They don't scale as well to 4K either, but outside of those weaknesses they're very competitive. I'd have gone for one this time around if I didn't get a hot lead on an RTX 3070.
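For the curious, here's a toy model of why the Kyro's overdraw mitigation mattered - nothing like PowerVR's actual tile-based hardware, just an illustration of how resolving visibility before shading saves fill rate compared to shading every fragment of every triangle:

```python
# Each "triangle" is modeled as a span of pixels on one scanline at a constant
# depth: (start, end, depth). An immediate-mode renderer shades every covered
# pixel of every triangle; a deferred (Kyro-style) renderer figures out which
# surface wins each pixel first, then shades each pixel exactly once.

def immediate_shades(triangles, width):
    """Count pixel-shading operations for a painter-style immediate renderer."""
    return sum(min(end, width) - max(start, 0)
               for start, end, _depth in triangles)

def deferred_shades(triangles, width):
    """Count shading operations when visibility is resolved before shading."""
    covered = set()
    for start, end, _depth in triangles:
        covered.update(range(max(start, 0), min(end, width)))
    return len(covered)  # every visible pixel is shaded once, overdraw is free

# Three overlapping surfaces across a 100-pixel scanline
tris = [(0, 100, 0.9), (20, 80, 0.5), (40, 60, 0.1)]
print(immediate_shades(tris, 100))  # 180 shades - lots of wasted texture work
print(deferred_shades(tris, 100))   # 100 shades - one per visible pixel
```

Scale that saving up to full scenes with heavy overdraw and you can see how a chip with modest raw fill rate punched above its weight.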
|
# ¿ Nov 25, 2020 00:27 |
|
Subjunctive posted:How does the lack of tensor units hurt RT performance? I thought it was a lack of dedicated BVH hardware that was causing them problems. I was under the impression that tensor units could be useful for denoising operations, at least. But if they don't make an appreciable difference and I'm talking out of my rear end, I'd like to know that.
|
# ¿ Nov 25, 2020 01:21 |
|
repiv posted:There's a tensor-based denoiser in OptiX for offline renderers, but no games use the tensor cores for denoising Okay, good to know. What's the likelihood that a DLSS-style solution won't work as well without them?
|
# ¿ Nov 25, 2020 01:51 |
|
Saturnine Aberrance posted:3080 just arrived; I've got it installed now but I'm getting some intense color banding in my quick tests in Control and Quake 2. Not sure what's going on. Best guess is driver update weirdness. You're not alone, I've also noticed banding with my 3070 that I didn't see on my Vega Frontier Edition. It's not the Nvidia driver setting limited range color over HDMI for my monitor, either.
|
# ¿ Nov 28, 2020 02:01 |
|
Some Goon posted:What's the best gpu you can get with vga out? My 12GB GTX Titan X had a DVI-I port. I don't think I ever had occasion to connect a VGA monitor to it, but the passive dongle I keep in my parts bin should have worked fine. A 980 Ti's basically the same card with 6GB RAM, and is a lot cheaper and easier to find, so for native support that's probably the ticket. As others have said, you can get a dongle that will work with HDMI or DisplayPort on anything, but it's an active converter and will cost surprisingly much for the, uh, privilege.
|
# ¿ Nov 29, 2020 06:41 |
|
wibble posted:I've built a "spare" Pc for emergency or special projects from spare bits. The only missing bit is a graphics card. I've got a 3GB Radeon R9 280X that could go to a good home. It's on par with a 1050 Ti, though it drinks power by comparison, and has two mini-DisplayPorts, an HDMI, and a DVI-I connector. PM me - I'll make you a fair deal. It's in barely used condition.
|
# ¿ Nov 29, 2020 17:44 |
|
I dug The New Order's gunplay and really liked the way the writers pulled off a magical realism story with Nazis, hypertechnology, and robots. It felt like that soured into an alternately dour and tasteless pulp story in New Colossus, and I gave up around Area 52. I keep meaning to go back, and one day I will, but I honestly don't like it as much. Youngblood looked like more of what I disliked from Colossus combined with microtransactions and weird difficulty scaling, which made the decision to avoid it easy.
|
# ¿ Nov 30, 2020 17:35 |
|
MaxxBot posted:Yeah I just sold my 2080 Ti for $800 and I see some going for $900+ now, if you're really willing to drop $900 on a used 2080 Ti at the end of 2020 I don't get why you don't just buy a scalped 3080 but whatever. Guaranteed availability - the 3080's got more variables attached than someone selling a card they can demonstrate owning and can drop in the mail within 24 hours. Props on your sale, that's a nice chunk of change.
|
# ¿ Nov 30, 2020 19:09 |
|
Arvid posted:Back in 2013 i built a PC for the living room in a Streacom FC5 cabinet with a i3 processor. Bought a 65" Samsung Q90T at black friday and promptly discovered that the integrated graphics on the i3 can not actually do 4K. 180mm isn't a ton of room... I'd guess a passively cooled Geforce GT 1030 would work. Maybe like this one?
|
# ¿ Dec 2, 2020 04:41 |
|
|
|
Dravs posted:I am very happy though, I am upgrading from an Rx vega 56, which has to be undervolted and run with ancient drivers because otherwise it constantly crashes my PC. I can't express enough my disappointment with it. Oh hey, Vega buddy. I had an Asus Arez Vega 56 that was rock solid, but traded it with a little cash for a Vega Frontier Edition that was a nightmarish hassle of undervolting and underclocking until I installed a Raijintek Morpheus II Vega Edition heatsink on it. Even now I downclock and undervolt a little for peace of mind... Sustained, unlimited hammering of the GPU (like Doom Eternal with vsync turned off, or Tensorflow) can still see my thermal junction temperatures creep closer to the mid-80s than I'd like. I could have slapped on bigger, louder fans, but I got a good deal on a pair of slim Noctuas that are barely audible. YMMV. I will say that the Gigabyte 3070 Eagle I grabbed with Handsome Ralph's help has been awesome. Good luck to you!
|
# ¿ Dec 4, 2020 17:58 |