Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

Squibbles posted:

I had a Matrox Mystique. It was interesting because it was nearly as fast as a 3dfx card but had basically zero features. No coloured lighting, no transparency, etc. In Jedi Knight multiplayer if someone used force blind on you it just turned every other pixel on your display white instead of actually blinding you and slowly letting your vision fade back in.

I also had an S3 Virge card which was neat but also commonly known as a 3D decelerator, since games would actually run better in software most of the time, though they wouldn't look as fancy.

Sadly the Diamond cards mostly had relatively boring box art with spaceships or airplanes on the front.

The Mystique was notable because it was superficially really fast, but - as you say - lacked a number of features which rapidly became very important. Among them:
* antialiasing
* mipmapping
* bilinear filtering
* fogging
* hardware transparency

Given that dearth of features - the chip was probably missing a number of blending modes too, my memory’s just too rusty - the Mystique worked about as well as it could have. For 2D it was enviable at the time; I kept one with a 3Dfx Voodoo1 in one computer or another for years.

The Virge had a decent feature set but was designed to manage software-quality rendering at lower resolutions, not to be brutally proficient. Layer on bilinear filtering, on-chip z-buffering, perspective correct texture mapping, and a full suite of effects, and it tanked. Later Virges were twice as fast as the originals, but by then S3 was making the most of a bad situation and trying to shove the Savage cards out the door. Which was its own saga...


Hasturtium
May 19, 2020

Hey goons, I snagged a Radeon Vega Frontier Edition for scientific computing with some light gaming on the side. The default clocks and thermals are pretty... optimistic, given its cooling, and I've had some luck with undervolting and selective downclocking while playing with the power limit. But here's my question: does anyone here have the stock Vega 64 pstate voltages and clockspeed values? I've had a bear of a time finding them online, and would love to have access to that data as a saner baseline than what the card tries to push by default.
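For anyone else poking at this on Linux, the amdgpu driver exposes the pstate table through sysfs as `pp_od_clk_voltage`, so you can at least snapshot your own card's table before touching anything. A minimal sketch of reading that table back into something usable — the sample values below are purely illustrative, not the stock Vega 64 numbers I'm after:

```python
import re

def parse_pp_od_clk_voltage(text):
    """Parse amdgpu's pp_od_clk_voltage sysfs table into {section: [(MHz, mV), ...]}."""
    tables, current = {}, None
    for line in text.splitlines():
        header = re.match(r"(OD_[A-Z]+):", line)
        if header:
            current = header.group(1)
            tables[current] = []
            continue
        row = re.match(r"\s*\d+:\s*(\d+)MHz\s*(\d+)mV", line)
        if row and current:
            tables[current].append((int(row.group(1)), int(row.group(2))))
    return tables

# Illustrative values only -- NOT the actual stock Vega 64 table.
sample = """OD_SCLK:
0:        852MHz        800mV
1:        991MHz        900mV
OD_MCLK:
0:        167MHz        800mV
"""
print(parse_pp_od_clk_voltage(sample)["OD_SCLK"])
```

Feeding it the contents of `/sys/class/drm/card0/device/pp_od_clk_voltage` before you start tweaking gives you your own baseline, even without the reference numbers.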

Hasturtium
May 19, 2020

Out of curiosity, in terms of performance and especially compute, how much of an upgrade would a 2070 (Super) be over a GTX Titan X (Maxwell)? I'd imagine the RTX 3*** cards will have people dumping their old ones and I'm not above scavenging.

Hasturtium
May 19, 2020

Onboard S3 Trio32/64
Matrox Mystique
3Dfx Voodoo Graphics
3Dfx Voodoo3
PowerVR Kyro II
Geforce3
GeForce FX 5900XT
GeForce 7800GS
GeForce 9600GT
GeForce GTX 550 Ti
GeForce GTX 660
Radeon RX 480
GeForce GTX 1070 Ti
Radeon Vega Frontier Edition

This does not include all the other systems I put together because I was bored along the way... if you can name a graphics card make and model from the last twenty-five years, I’ve probably at least played with it.

Hasturtium
May 19, 2020

Yeah, this screams "we failed our console cert" and now they have to go through the boondoggle again. Godspeed to them - if they miss Christmas, there'll be hell to pay. :bahgawd:

If there's one thing I'm not interested in doing, it's hitching my wagon to Nvidia's star on the back of their proprietary tech. My needs are basically 1440p75, with the expectation the card will last half a decade and possibly be amenable to some scientific computing and machine learning. If the RX 6800's a solid 3080 competitor with more RAM, I won't hesitate to grab one.

Hasturtium
May 19, 2020


The OpenCL driver will probably work in Windows, so it'll run BOINC projects. And at some point ROCm support will (hopefully? prayerfully?) show up in Linux, at which point I'll dual boot it and use it in conjunction with my GTX Titan X. Thanks for the condescension, all the same.

Hasturtium
May 19, 2020


jkyuusai posted:

Yeah wait, is this like.... they put a conditional in the TV's firmware that says "process the signal differently if the input I'm running on is named PC and has this specific icon"?????

It’s a weird shorthand way of indicating, “this input will not benefit from being run through whatever DSP is in place to perform MPEG noise removal/frame interpolation/other image enhancing features, so just pass it through as fast as possible to mitigate latency.” At least, I assume that hasn’t changed much from my circa 2009 HDTV...

Hasturtium
May 19, 2020

God, I just want an RX 6800 on launch day without having to enter a pact with a lower power. Any pointers?

Hasturtium
May 19, 2020


Some Goon posted:

Have lived a pure and kindly life so that a higher power might help you?

Pure, no. Kindly, yes. Mostly I just wish I knew what *time* to start hammering places. Sounds like Best Buy has been more reliable than NewEgg for RTX sourcing, so maybe I'll get lucky there.

Hasturtium
May 19, 2020


repiv posted:

females on blower are back, in anime titty form

https://twitter.com/VideoCardz/status/1325854488540946433

See, this just makes me want Yeston to make a Cute Pet 2 RX 6800.

Hasturtium
May 19, 2020


th3t00t posted:

So I managed to snag a 3080 from Newegg right after my zotac 3070 shipped and could no longer cancel it.

What are the rules for selling it to a goon? I just want to break even at $576, if anyone is interested. I don’t have PMs.

It arrived today and the box is bigger than I was expecting.

I'm tempted to buy it from you, impending RX 6800 release be damned... If you still have it, let me know.

Hasturtium
May 19, 2020


Zeta Acosta posted:

whats most likely: radeon 600, rx 5800 or rx 6700?

RX 6700; either of the others would confuse the hell out of people. Not that the others are impossible, but unified branding is something AMD will want to project if they have any control over it.

Hell, I wish a 12GB RX 6700 were available right now.

Hasturtium
May 19, 2020


Handsome Ralph posted:

In the off chance I can help a Goon who wants a 3070 and has had poo poo luck, before I cancel it, does anyone want this?

https://www.amazon.com/gp/product/B08KXZV626/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1

Me please, depending on price

Hasturtium
May 19, 2020

Just wanted to add that Handsome Ralph's coming through for me with an RTX 3070. God bless you, goon.

Hasturtium
May 19, 2020


sauer kraut posted:

There is no reference 590, what you probably meant is that whoever built your card used an inadequate entry level cooling solution for a 225W chip plus memory etc on top that is pushed way beyond where it should be (ie the 480).

Very much so. The 580 itself amounted to a factory overclocked and overvolted RX 480 with minor bug fixes to power delivery, where a 6% performance improvement was accomplished by ramping TDP up from 150 to 185W. The 590 doubled down on this with a die shrink from 14nm to 12nm, and a performance jump of around 12% facilitated by TDP growing again to 225W. So for less than a 20% gain in performance over an 8GB RX 480, the power budget mushroomed 50%. And it sounds like whoever built OP's card scrimped to save a few pennies. Underclocking 5 to 10% could put the core closer to its sweet spot and allow for a disproportionately nice undervolt, and a custom fan curve could possibly cut some of that fan noise.
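Putting those numbers side by side — treating the +6% and +12% figures above as rough assumptions rather than benchmarked values — the efficiency slide is easy to see:

```python
# Relative performance and TDP figures from the post above, treated as rough
# assumptions rather than benchmarked numbers (RX 480 8GB as the baseline).
cards = {
    "RX 480": (1.00, 150),
    "RX 580": (1.06, 185),          # +6% over the 480
    "RX 590": (1.06 * 1.12, 225),   # +12% again over the 580
}

base_perf, base_tdp = cards["RX 480"]
for name, (perf, tdp) in cards.items():
    eff = (perf / tdp) / (base_perf / base_tdp)
    print(f"{name}: {perf / base_perf - 1:+.0%} perf, "
          f"{tdp / base_tdp - 1:+.0%} power, "
          f"{eff:.0%} of the 480's perf-per-watt")
```

By these figures the 590 lands at roughly four-fifths of the 480's perf-per-watt, which is the whole story in one number.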

Or you can just accept the 590 at face value and wear headphones. That's legit too.

Hasturtium
May 19, 2020


Inept posted:

Mocking weird anime boob cards isn't performative but simply recognizing a fact that they're hilarious and gross. See also; video card with lips.

It's descended from the same ridicule we used to give to cards with artwork of cyborg frogs and big robots and fairies on them. The difference now is that this artwork is more akin to what you'd expect from someone hawking their OnlyFans, except it's on a GPGPU. It's objectifying and weird and a bridge too far for a lot of people. They should never have messed with perfection.

Hasturtium
May 19, 2020


Theophany posted:

seta com_maxfps 43/76/125 for the Quake 3 engine. Allowed you to pull off trickjumps and I think strafejumping?

You could also set a shitload of variables to control how frequently you pinged the server.

God, I'm having flashbacks. A lot of guys who wanted to Git Gud but didn't have much money opted for Nvidia TNT2s with all options turned down far past minimum settings (r_picmip 32 in one guy's config made me laugh out loud) because they had solid triangle throughput. At that point you were basically just pushing triangles as fast as you could go with trivial texturing requirements and some simple lovely sprite items for visibility. They swore by it, but it struck me as too damned ugly to be worthwhile.
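The reason those exact com_maxfps values mattered, as I understand it: the engine rounds each frame to a whole number of milliseconds, so most caps don't actually run at the rate you asked for, and 43/76/125 are the quantization sweet spots whose frame times interact nicely with the physics. A quick sketch, assuming simple integer truncation:

```python
# Quake 3 rounds each frame to whole milliseconds, so most com_maxfps values
# don't run at the rate you asked for. The classic 43/76/125 caps are the
# ones whose rounded frame times line up usefully with the physics.
for fps in (43, 76, 125, 200):
    msec = 1000 // fps               # frame time the engine actually uses
    effective = 1000 / msec          # framerate the physics really sees
    print(f"com_maxfps {fps:3d} -> {msec:2d} ms/frame, ~{effective:.1f} fps effective")
```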

Hasturtium
May 19, 2020


Handsome Ralph posted:

Asked this earlier and didn't see a response so I figured I'd ask one more time.

Anyone else after installing a 3080 suddenly get 4 beeps on POST?

Apparently it's only an issue for 3080 FE cards but there have been zero issues otherwise and the card runs fine for me. It's such an odd and random quirk I figured I'd see if anyone else here ran into it.

Pardon the Reddit link, that's where I first found that it was a 3080 issue.
https://www.reddit.com/r/pcmasterrace/comments/j18rf7/mainboard_beeping_4_short_times_after_installing/

It sounds to me like a legacy BIOS compatibility problem on systems with CSM enabled. The CSM bitches that it can't find a compatible BIOS for the card, then the UEFI takes over after the CSM issues the four beeps of consternation.

Weird, either way.

Hasturtium
May 19, 2020


Handsome Ralph posted:

Huh, that is weird. Mystery solved, I guess. Thanks!

Sure! Disabling CSM will probably fix the issue outright.

Hasturtium
May 19, 2020

Well, I've gotta say, the Gigabyte Eagle RTX 3070 is deliriously good. After using a procession of AMD cards I'm shocked by how LITTLE this thing is - long but narrow and surprisingly light. It's also quiet and profanely fast, and feels like heaven after dealing with a finicky Vega Frontier Edition for months. As it's connected to a vintage Dell 1200p monitor and a 1440p75 monitor, I predict this will last me for the next five years, easy. Thanks again to Handsome Ralph for making this possible!

Hasturtium
May 19, 2020

Having just snagged an RTX 3070, I wish I had enough time and money to grab an RX 6800 too. Of course that'd be contingent on actually being able to source one... but for now it'll be RTX 3070 on my Windows machine, Vega Frontier Edition in my Linux box. C'mon, ROCm, get RDNA(2) working already.

Hasturtium
May 19, 2020


gradenko_2000 posted:

so TPU's review of the RX 6800 [Flat] has it consistently ahead of the RTX 3070 ... buuuuuuut if the 6800 is also more expensive than the RTX 3070 then the value proposition is still toast.

For the extra cash you get higher rasterization performance and 8GB extra RAM, but give up DLSS in the short to medium term and have provably lower raytracing performance. It's a tossup, but if I hadn't suddenly had the chance to snag an RTX 3070 I'd leap for a vanilla 6800 myself.

Hasturtium
May 19, 2020


bus hustler posted:

OK the Accelero IV is ugly as poo poo and I am going to 100% need to redo this installation but it's SO MUCH QUIETER than my stock blower on the 2080ti.

I regret buying a new CPU cooler actually, it was only the GPU that was loud.

But I bought some heatsinks to use per many many recommendations and the ones I bought just... suck. They came with thermal adhesive that sucks and some fell off. I can't let it stay like that. Oh well, it was a pretty easy install. The 100% reference 2080ti board is actually pretty simple with almost no heat from the bottom.

I'm glad that worked out for you. I have a Core i9-7940x with a Vega Frontier Edition, and am finally just sick to death of the card's noisy stock cooler. Daring to load the GPU sounds like an angry box fan - even trying to undervolt and underclock only does so much to alleviate the droning sound. With any luck the Raijintek Morpheus II Vega cooler I snagged a few days ago will tame it. Wish me luck.

Hasturtium
May 19, 2020


hobbesmaster posted:

Why would you want a 980ti, retailers still want over $500 for them for some dumb reason. Look:
https://www.newegg.com/msi-geforce-gtx-980-ti-gtx-980ti-lightning-le/p/N82E16814127910?Item=9SIAE8DBX68930&quicklink=true
I see them used on ebay for $350. Again, $50-100 more than a 1660.

Benchmarks: https://www.techspot.com/review/1808-geforce-gtx-1660-ti-vs-rtx-2060-vs-gtx-980-ti/

Secondhand I see them go for closer to $160 to $180, which is closer to rational. I sold a Titan X last month for $250, which is basically a perfectly rational asking price for a 980 Ti with a big enough framebuffer for machine learning applications and 1080p60 with texture storage to spare. eBay skews high for this stuff. Those retailers selling new parts are either hoping for collectors or stubbornly refusing to update their prices in hopes somebody desperate or clueless enough to pay will come along.

Hasturtium
May 19, 2020

I know it's not a new piece of hardware, but after enduring my Vega Frontier Edition's godawful noise and thermals I plonked on a Raijintek Morpheus II and two 12.5mm-high 120mm Noctua fans. It is a MASSIVE visual downgrade, but I went from routinely skirting overheating with a noise profile like a box fan in a dirty factory to gently audible with temps that never nudge above 80 Celsius in exhaustive torture tests. Recommended - though maybe I should have gone with be quiet! fans for aesthetic reasons.

Hasturtium fucked around with this message at 20:24 on Nov 24, 2020

Hasturtium
May 19, 2020


Internet Explorer posted:

It's beautiful, and I will not have you talk poorly about such a precious baby.

Well, aren't you sweet. Here, have a topless shot.

Hasturtium
May 19, 2020


Sininu posted:

You made a very good decision

Here's mine!



Very nice! The walkthrough I followed indicated that heatsinks on the chokes weren't crucial versus heatsinks on the VRMs - did I err by not sticking some on? It's been solid and stable so I haven't been that worried, but I wanna be sure I did this right.

Hasturtium
May 19, 2020


Sininu posted:

It's cute


I don't know for sure but I feel like the walkthrough must've been right. I don't have heatsinks on my VRAM because there wasn't enough clearance and this thread assured me it was fine.

What would even happen if VRAM ran at too high temperatures? I want to know.

I'm a geologist, not an electrical engineer, but spitballing it: above thermal specifications, you'd start getting errors and instability, much like what overaggressive RAM overclocks produce. Get it legitimately too hot in a sustained way and you could actually damage the chips, or maybe even fry the IMC. Alternatively, based on the recent Gamers Nexus video, wait six months to a year and see what happens to PlayStation 5s!

Hasturtium
May 19, 2020


kloa posted:

slim noctua buds :hfive:



They are really good. I've got a pair of super quiet grey ones in a different PC, but they'd have added to a height requirement that already felt ostentatious. You have a great-looking setup!

Hasturtium
May 19, 2020


sean10mm posted:

I mean, the 6800XT also eats poo poo in RT performance vs. the 3080 in Control, 3dMark Port Royal, Metro Exodus and Battlefield 5. Are they all nvidia shills?

The 6800 series is very good if you don't care about RT performance (which is fine!), but if you do it's a big yikes.

If - and it's a big if - AMD creates an open DLSS alternative with similar results, can we assume their raytracing performance amounts to 50-60% of Nvidia's current generation? Or is that too optimistic?

Hasturtium
May 19, 2020


Kunabomber posted:

All pretty good points - I guess I'm tired of F5ing and finding justifications.

This reminds me of the time when I went for the Kyro 2, thinking hardware T&L wasn't that big of a deal. :v:

This isn't quite like that... The Kyro (2) was a TNT2-class graphics card with DXT1 compression support and a hell of a knack for overdraw mitigation, at a point when texture compression, hardware T&L, cubemapping, and stencil buffer shadows were available from competitors and the first primitive pixel and vertex shaders were coming around. The new Radeons functionally do most of what the Geforces do, outside of lacking dedicated tensor units (which we've seen hobbles raytracing performance) and machine learning supersampling, which is allegedly coming down the pike. They don't scale as well to 4K either, but outside of those weaknesses they're very competitive. I'd have gone for one this time around if I didn't get a hot lead on an RTX 3070.

Hasturtium
May 19, 2020


Subjunctive posted:

How does the lack of tensor units hurt RT performance? I thought it was a lack of dedicated BVH hardware that was causing them problems.

I was under the impression that tensor units could be useful for denoising operations, at least. But if they don't make an appreciable difference and I'm talking out of my rear end, I'd like to know that.

Hasturtium
May 19, 2020


repiv posted:

There's a tensor-based denoiser in OptiX for offline renderers, but no games use the tensor cores for denoising

DLSS is currently the only application of the tensor cores in real-time rendering

Okay, good to know. What's the likelihood that a DLSS-style solution won't work as well without them?

Hasturtium
May 19, 2020


Saturnine Aberrance posted:

3080 just arrived; I've got it installed now but I'm getting some intense color banding in my quick tests in Control and Quake 2. Not sure what's going on. Best guess is driver update weirdness.

You're not alone, I've also noticed banding with my 3070 that I didn't see on my Vega Frontier Edition. It's not the Nvidia driver setting limited range color over HDMI for my monitor, either.
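For anyone else chasing banding: the usual culprit is the driver sending limited-range ("TV level") output that the display then re-expands. Even with that ruled out, as in my case, it's worth a sanity check, because limited range throws away enough steps to band visibly. A toy illustration of the cost (`full_to_limited` is just a sketch of the level mapping, not any driver's actual code):

```python
# Limited-range ("TV") levels squeeze full-range 8-bit video into 16-235.
# If the GPU outputs limited range and the display re-expands it, only 220
# of the 256 input steps survive -- which shows up as banding in gradients.
def full_to_limited(v):
    return round(16 + v * (235 - 16) / 255)

levels = {full_to_limited(v) for v in range(256)}
print(len(levels))   # distinct output levels left after the squeeze
```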

Hasturtium
May 19, 2020


Some Goon posted:

What's the best gpu you can get with vga out?

My 12GB GTX Titan X had a DVI-I port. I don't think I ever had occasion to connect a VGA monitor to it, but the passive dongle I keep in my parts bin should have worked fine. A 980 Ti's basically the same card with 6GB RAM, and is a lot cheaper and easier to find, so for native support that's probably the ticket. As others have said, you can get a dongle that will work with HDMI or DisplayPort on anything, but it's an active converter and will cost surprisingly much for the, uh, privilege.

Hasturtium
May 19, 2020


wibble posted:

I've built a "spare" Pc for emergency or special projects from spare bits. The only missing bit is a graphics card.
I was somewhat naive, thinking I could spend $40 and pick something nice up but its crazy out there on ebay... Was searching for stuff like 970,960, 1050 or 1060.
What other cards should I expand my search to? I need DP, HDMI and the case is mATX so cant take a full sized triple fan cards.

I've got a 3GB Radeon R9 280X that could go to a good home. It's on par with a 1050 Ti, though it drinks power by comparison, and has two mini-DisplayPorts, an HDMI, and a DVI-I connector. PM me - I'll make you a fair deal. It's in barely used condition.

Hasturtium
May 19, 2020

I dug The New Order's gunplay and really liked the way the writers pulled off a magical realism story with Nazis, hypertechnology, and robots. It felt like that soured into an alternately dour and tasteless pulp story in New Colossus, and I gave up around Area 52. I keep meaning to go back, and one day I will, but I honestly don't like it as much. Youngblood looked like more of what I disliked from Colossus combined with microtransactions and weird difficulty scaling, which made the decision to avoid it easy.

Hasturtium
May 19, 2020


MaxxBot posted:

Yeah I just sold my 2080 Ti for $800 and I see some going for $900+ now, if you're really willing to drop $900 on a used 2080 Ti at the end of 2020 I don't get why you don't just buy a scalped 3080 but whatever.

Guaranteed availability - the 3080's got more variables attached than someone selling a card they can demonstrate owning and can drop in the mail within 24 hours. Props on your sale, that's a nice chunk of change.

Hasturtium
May 19, 2020


Arvid posted:

Back in 2013 I built a PC for the living room in a Streacom FC5 cabinet with an i3 processor. Bought a 65" Samsung Q90T at Black Friday and promptly discovered that the integrated graphics on the i3 cannot actually do 4K.
Figured I could perhaps add a video card to it to enable 4K? What would be the cheapest GPU with passive cooling that supports 4K? According to the Streacom web site it seems the maximum length of the card is 180mm.
Guess I will also need a riser cable since the GPU must be mounted horizontally, never used a riser cable before, can I just go with the cheapest one I can find?

180mm isn't a ton of room... I'd guess a passively cooled Geforce GT 1030 would work. Maybe like this one?


Hasturtium
May 19, 2020


Dravs posted:

I am very happy though, I am upgrading from an Rx vega 56, which has to be undervolted and run with ancient drivers because otherwise it constantly crashes my PC. I can't express enough my disappointment with it.

Oh hey, Vega buddy. I had an Asus Arez Vega 56 that was rock solid, but traded it with a little cash for a Vega Frontier Edition that was a nightmarish hassle of undervolting and underclocking until I installed a Raijintek Morpheus II Vega Edition heatsink on it. Even now I downclock and undervolt a little for peace of mind... Sustained, unlimited hammering of the GPU (like Doom Eternal with vsync turned off, or Tensorflow) can still see my thermal junction temperatures creep closer to the mid-80s than I'd like. I could have slapped on bigger, louder fans, but I got a good deal on a pair of slim Noctuas that are barely audible. YMMV.

I will say that the Gigabyte 3070 Eagle I grabbed with Handsome Ralph's help has been awesome. Good luck to you!
