Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Kraftwerk posted:

Just a quick tally again-
Who plans to get a reference card for the long run and who's going AIB?

I'm planning on going reference. For how my case's cooling is set up it should work very well. I have my AIO radiator in a push/pull setup as my intake, so that air will go to the GPU; the hot air will then go out the back of the GPU or get sucked up by my rear exhaust. The hot air shouldn't affect the CPU as much as it does now, when hot air gets recirculated in the case much more.


Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

BurritoJustice posted:

I wonder if a single site will do 3090 NVLINK gaming benchmarks lol

Count on Linus to do it at some point.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Black monolith case crew represent. My Fractal Define S can handle up to a 425mm GPU and its glorious black steel prevents me from seeing the technicolor puke fest going on in my case. Everything with remotely decent performance seems to have RGB nowadays.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

BurritoJustice posted:

Holy poo poo r/amd is melting to the ground right now.

Check this genius maths


2080=1080=R7?

Confusing 30% less versus 50% more? Stretching a 5700xt to a 3090 somehow?

The current top trending thread in r/AMD is called "I am genuinely scared for AMD GPU's".

Today is not a good day to be an AMD fanboy. And it'll be a really bad day if AMD doesn't say anything once September 14th hits and people can actually buy Ampere cards.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
I was thinking about selling the 2080 Ti I have for like 600 to recoup costs, but one of the downsides of a Ryzen CPU is that if I sell my current GPU and don't buy another one in the interim, I won't have a working PC. So I'll just hold onto it and watch it drop in value so fast that r/wallstreetbets will be impressed. I'm probably just gonna end up giving it to my brother-in-law, who wants a gaming PC and whose exposure to PC gaming at this point is playing PSO 2 on a tablet with an Intel iGPU.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

shrike82 posted:

lol hope you didn't buy NVDA right after the Ampere announcement, it's down 10% the past day

:rip:

AMD was recently up 6% after they announced the release of the...RX 5300. Stonks make no sense when it comes to product launches.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Rubellavator posted:

Am I missing something cause looking at the dimensions it seems like the 3080 FE is smaller than my Zotac 1070?

Quite possibly so. The 3080 FE is only a dual-slot design, measuring 285mm x 112mm. The triple-fan Zotac 1070 looks to be 303mm x 111.5mm.

The 3090's are the ones with the CHONK coolers.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Paul MaudDib posted:

Igor’s lab: big Navi will slot between 3070 and 3080 at 275W or match 3080 at 300W. No partner cards this year.

:chloe:

https://www.youtube.com/watch?v=lpTPzoBWR4Q&t=28s

Igor's Lab also reported that the AIBs didn't have the bill of materials for the card yet as of a couple weeks ago. It sounds like it would be basically impossible for them to actually have product available for the Big Navi launch if that's the case. So expect a pittance of reference cards at launch so AMD can claim Big Navi hit its intended release date, with AIBs being delayed to 2021.

And y'all thought NV's launch was looking paper thin; this is looking real dire.

At least AMD has the opportunity to ignore TDP like Nvidia has and just send out juiced-to-the-gills cards like this was some Fury X poo poo. They may be able to look remotely competitive at that point, as long as you ignore the wall of features Nvidia has that AMD won't have, and presume their drivers actually work as intended.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

repiv posted:

turing also has the full dx12 ultimate feature set, which has been neglected so far but should start seeing use as games begin to target the new consoles

there's some nice performance wins to be had there

I think the first game to demonstrate these performance gains is the Minecraft RTX beta. It recently got upgraded to DXR 1.1 support and the performance gains are massive, up to like 50% from what I've seen on my card. It's also really buggy and crashing a lot now, but that's why it's a beta.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

This won't stop a Saudi Oil Baron's kid from buying the GPU, need to price it even higher.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

shrike82 posted:

20% over the 2080ti at 1440p for actual games is a bit more muted than i think what people were expecting

At 1440p I expect the 3080 to be CPU bottlenecked, so a smaller gap between the 2080 Ti and 3080 at that resolution than at 4K seems expected.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Taima posted:

They said it.

https://www.nvidia.com/en-us/geforce/news/rtx-3090-out-september-24/


hash tag #lit

The 3090 is insanely loving stupid unless you have a specific use case. I've been prepped to buy one since they were announced but I just can't. It's not a money issue, it's an "I would feel like a gullible piece of poo poo" issue.

I wonder if the 3090 FE is extremely power limited compared to the 3080 FE; it's only got 20 more watts to work with for 20% more cores and more than double the VRAM. I'll need to see how the AIB 3090s end up for overall performance boost. If it's more consistently 15-20% on the AIBs I'll go that route despite the poor value, but if it trends closer to 10% I'll probably just get the 3080 to replace my 2080 Ti. Losing 1GB of VRAM shouldn't be an issue since I don't play at 4K anyway, 3440x1440 on my main monitor and 2880x1600 on the Index.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Taima posted:

So who is in 10-15% performance for 115% extra cost crew?? Holla

I'm out on the 3090. My mental breakpoint was probably a consistent 15% with some games seeing closer to 20%, but it's closer to 10%. The ROG Strix is a bit faster than other cards if you're willing to give a card 480W but at that point it's not worth it, especially since the Strix is a solid 300 bucks more than MSRP.

Will start F5'ing for a 3080 to replace my 2080 Ti.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Shogunner posted:

lmao this video rules, i'm starting to really like this guy

thank you goons + tech jesus for talking me out of a 3090, which i only wanted for flight sim VR and turns out it gets worse 1 fps on average at 1440p lmfao rip

Tech Jesus doesn't give a gently caress. His primary interest is empirical testing of computer components, and your product's value will be determined by how well it tests. He's not one of the conductors of the Hype Train that so many techtubers are. This has also gotten him heat in the past from companies who send him stuff to review, only to get their products poo poo on because they failed his empirical testing. His "Zen is an i7 in production, but an i5 in gaming" video also got r/AMD mad at him for a while. And if you make r/AMD mad, you're doing a good job with your life.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Looking at everything now, I think I'll end up getting one of the 3x 8-pin 3080s like an FTW3 or Asus Strix. Looks like they should be good for another 5% or so more performance through ridiculous wattage, which will close the gap between the 3080 and the 2x 8-pin 3090s even further. The Asus Strix 3090 that TechPowerUp reviewed performed more like how I wish a 3090 would have in general, a 19% overall improvement over a 3080 FE, but that's a power-limited 3080 vs a 3090 that was using up to 480w. A 3080 with a high power limit should close that gap by 5-10%; stuff like the Asus TUF OC is already 3-4% faster than the FE without needing to go 3x 8-pin.
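Just to put rough numbers on that, treating the review percentages above as ballpark relative-performance multipliers (not benchmark data):

```python
# Rough relative-performance math using the review percentages quoted above.
# These are ballpark multipliers against a stock 3080 FE, not measured results.
fe_3080     = 1.00
tuf_oc_3080 = 1.035   # ~3-4% over the FE out of the box
strix_3090  = 1.19    # TechPowerUp's ~19% over a 3080 FE at a 480w power limit

gap_vs_fe  = (strix_3090 / fe_3080 - 1) * 100
gap_vs_tuf = (strix_3090 / tuf_oc_3080 - 1) * 100
print(f"Strix 3090 over 3080 FE:       ~{gap_vs_fe:.0f}%")   # ~19%
print(f"Strix 3090 over a TUF OC 3080: ~{gap_vs_tuf:.0f}%")  # ~15%, before any further 3080 OC
```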

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

AirRaid posted:

I did some comparison testing with the new driver.

I did 3 Time Spy runs before upgrading, and 3 after. Results with GPU-Z pasted on are here - https://imgur.com/a/PHiHSMX
GPU Core, Max Power and Max Voltage are showing Max figures because the kopite kimi guy on twitter posited they might be lowering clocks or messing with voltage/power caps.

TLDR: 3 runs on the old driver netted graphics scores of 14955, 14994, 15157, with max clock 1950, Max power ~353 and Max Voltage 1.0810V

3 New driver runs saw graphics scores of 16996, 16844, 16858, Max clock split between 1935 and 1950, Max power 1 or 2 W higher, Max Voltage the same.

Interestingly, my CPU score also took a jump of about 250-300 points, bringing the overall average scores higher by ~1200 with the new driver.

Not the most thorough of tests but a really clear improvement in synthetic performance, at least.

I'm seeing similar results reported in the Nvidia subreddit. Crashing was resolved with the newest driver and it looks like this new driver is more performant on top of it.

Let's see if it really was just a driver issue this whole time and not a complete failure of engineering by Nvidia and their AIBs.
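For what it's worth, averaging the Time Spy graphics scores AirRaid posted above works out to roughly a 12% jump:

```python
# Average the quoted Time Spy graphics scores and compute the driver-to-driver gain.
old_driver = [14955, 14994, 15157]
new_driver = [16996, 16844, 16858]

old_avg = sum(old_driver) / len(old_driver)
new_avg = sum(new_driver) / len(new_driver)
gain = (new_avg / old_avg - 1) * 100

print(f"old: {old_avg:.0f}, new: {new_avg:.0f}, gain: ~{gain:.1f}%")  # ~12.4%
```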

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Alchenar posted:

AMD absolutely does not get credit for having their drivers 'pretty much fixed' by the time of the entirely next generation launch.

Sure they do. It's called FineWine and it's been a favorite topic of discussion among techtubers for years. Who doesn't love seeing how the R9 380 caught up to the GTX 970 two years after both cards were EOL'd?

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

His conclusion seems to be that the issues are more likely than not software related: the boost algorithm was just going too hard and the AIBs didn't have enough time to test the cards before release to adjust it. Better capacitors could obviously help against instability, but he couldn't recall a situation where the type of capacitor used significantly affected boost clock potential. Nvidia's driver release yesterday that adjusted boosts slightly seems to have fixed the boost-related crashes for most users.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Well at least Capgate was bullshit. So people don't need to worry about poo poo like cap configurations on their GPUs on top of mashing F5 with their faces to try to get one.

I might go to Microcenter later just for the laughs; I don't expect to walk out with anything. I'm a bit worried because if they tell me they don't have 3080s but do have 3090s, I'd probably break down and get the 3090. I had already saved the money and mentally prepared myself for spending 3090 money, but I know rationally it's an enormous waste. But I bought a 2080 Ti, so what the gently caress do I know about rationality.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Space Racist posted:

Right, but if you're capping frames, why would Vsync matter if you're never exceeding the max refresh?

Vsync and other frame rate caps aren't perfect and you can still sometimes get frame rates above your screen limit. This is why capping a few frames below the refresh rate (the usual -3 fps rule) is recommended.
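For common refresh rates that rule works out like this (a tiny, hypothetical helper just to show the arithmetic, not any particular limiter's setting):

```python
# Hypothetical helper: pick a frame cap a few fps under the display refresh rate
# so limiter imprecision never pushes the frame rate past the VRR/Vsync window.
def frame_cap(refresh_hz: int, margin: int = 3) -> int:
    return refresh_hz - margin

for hz in (60, 120, 144, 165):
    print(f"{hz} Hz display -> cap at {frame_cap(hz)} fps")
```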

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Bloopsy posted:

Back to the HDMI 2.1 chat...I bought this exact same 15ft cable from Amazon that Vincent used figuring that despite the warnings of going over 5ft it worked for him so it was worth a try. It arrived today and after a few hours of play with Cyberpunk and RDR2 I can say definitively that it does work great. No flicker, blackouts, or other graphical issues at 4k/120hz. RDR2 was the most egregious offender with my prior 2.1 cables and it worked flawlessly. I would say it's worth a buy.


https://www.amazon.com/48Gbps-Compa...194&sr=8-3&th=1

Good to hear a positive experience. I have the same cable, but I have yet to be able to test it at 4K 120Hz 4:4:4 chroma as I don't have an RTX 3080 or PS5 yet; it's been working great for me at 4K 60Hz at least.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Well, I finally joined one of those stock discords a couple weeks ago and it finally paid off today. I was able to score an RTX 3090 FE off Best Buy at MSRP, and with the stimulus check in hand I figured I might as well stimulate the economy. Most RTX 3080s are going to be pushing close to a grand after tax and shipping now with the tariff price increases, so the stimulus pretty much covered the price difference for moving up to the 3090.

My LG CX will finally be fed properly for 4K 120hz support.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Thanks to the ATR STONKS stock discord I was able to score a 3090 FE off Best Buy earlier this week and picked it up today. Sweet mercy is this thing huge. While my case was plenty big to handle it, I did have to move around a fair bit of wiring so that the card's fans weren't picking up cables. But it's pretty cool to be playing a game, particularly one that doesn't totally max out the card, and have it sit there at 800 RPM even though it's using near 350w of power, because temps are only in the high 50's to low 60's.

I had talked myself out of getting a 3090 as the price/performance is obviously trash on the card, but with the 600 dollars worth of coronabucks being deposited in my account and the ability to score a 3090 FE at MSRP without any tariff shenanigans, why not go in. I almost spent 1000 dollars on a 3080 from Zotac the day before, but the website crapped itself while I was putting in the order. With tax and shipping, a decent 3080 is probably going to be around 1000 bucks now here in the US thanks to the tariffs.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Also crypto mining is back with a vengeance with Bitcoin booming, and it's dragging other coins up in value, like Ethereum, which is still feasible to mine with GPUs.

So there's about a half dozen reasons right now why GPU prices are ridiculous.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

buglord posted:

I feel like more console games would look (and run) nicer with AA disabled. That and motion blur seem to be the thing always on by default on console - and the game is usually worse for it. AA seemed to make more sense when I was playing at a resolution below 1080p, but at 1440p the jaggies aren’t too noticeable for me like they once were. Does that become less so at 4k?

Nope, AA is still as important as ever even at high resolution. Temporal aliasing is a big issue for overall image stability, and it isn't solved by more pixels. Temporal aliasing is generally worse at higher resolution than lower resolution, since it's an effect of not producing enough frames per second to smooth out the image. Many modern engines are also dependent on TAA being on for graphics to display properly: your average game engine now uses deferred rendering and builds the final image from multiple separate passes, and you need a TAA solution to make that not look like poo poo in motion, with pixel creeping and other anomalies. TAA softens the image, but game developers are getting better at fixing that with a post-processing sharpening pass that cleans up the image.
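To illustrate that last bit, here's a minimal sketch of what a post-TAA sharpening pass does, assuming a plain unsharp-mask style kernel rather than any specific engine's filter (real ones like contrast-adaptive sharpening are smarter about avoiding halos):

```python
# Minimal sketch of a post-TAA sharpening pass: a 3x3 sharpening kernel applied to
# the resolved frame, blended with the original so the strength is tunable. The
# idea is to recover edge contrast that the temporal accumulation blurred away.
import numpy as np
from scipy.ndimage import convolve

def sharpen(frame: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """frame: HxWx3 float image in [0, 1], after the TAA resolve."""
    kernel = np.array([[ 0, -1,  0],
                       [-1,  5, -1],
                       [ 0, -1,  0]], dtype=np.float32)
    sharpened = np.stack(
        [convolve(frame[..., c], kernel, mode="nearest") for c in range(3)], axis=-1
    )
    return np.clip((1 - strength) * frame + strength * sharpened, 0.0, 1.0)
```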

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Fauxtool posted:

bios on a mobo is far more puckering. At least a lot of gpus have dual bios.

And as long as you have another GPU, you can usually recover from a GPU BIOS update that goes bad by reflashing it.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

The Slack Lagoon posted:

If I'm getting a 3070 Ti and I have a 650W PSU, should I upgrade to the recommended 750W? All I have is an i7-8700, 4 sticks of ram, and 2 ssds.

Your PSU is more than enough for your system. A 3070 Ti should be using around 300w of power. A Core i7-8700 looks to max out at 115w, so that's roughly 415w needed for the two big parts. The rest of the system together is a trivial amount of wattage in comparison, maybe up to like 450w or so all together? Maybe 500w if your 3070 Ti has a higher power limit than the FE.

GPU manufacturers always overestimate PSU requirements to cover for people who have bad PSUs. Presuming you have a decent quality PSU, you should have no issues at all.
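Rough back-of-the-envelope tally of those numbers (ballpark estimates, not measured draws; the 50w figure for the rest of the system is an assumption):

```python
# Rough PSU tally using the wattage estimates above. All numbers are ballpark
# figures, not measured draws; "rest of system" is an assumed 50w.
parts = {
    "RTX 3070 Ti (board power)": 300,
    "Core i7-8700 (peak package)": 115,
    "RAM, SSDs, fans, motherboard": 50,
}
total = sum(parts.values())
headroom = 650 - total  # the 650w unit in question
print(f"Estimated peak draw: ~{total}w, leaving ~{headroom}w of headroom on the 650w PSU")
```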

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

MikeC posted:

What? When was this? I don't see anything in his channel. Did he delete it?

I believe he made a separate channel after his tech channel got laughed into irrelevance. That's the one where he goes full nutter with conspiracy theories.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Subjunctive posted:

My 3090 is loud enough when playing FFXIV or RDR2 that I’m tempted to go for a swap to an AIO. It’s an MSI Ventus and I have a decently-sized Be Quiet! case so I think it'd be fine geometrically.

What AIO kits are good for such purposes? Am I likely to gently caress anything up putting it on?

E: hmm, the only AIOs I can find are for reference layouts, which I believe my Ventus is not. I don’t want to have to make a custom loop…

Is your GPU getting loud due to the core getting hot or the VRAM getting hot? Swapping the thermal pads on the VRAM might be a better idea to start with if you haven't done so already. I don't have a Ventus but a 3090 FE, and the only reason my GPU's fans ever really ramp up isn't core temps, it's VRAM temps.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Finally got around to spending the time and money to do the thermal pad swap on my 3090 FE. I was getting really tired of hearing the card ramp up to 100% fan speed at 2600 RPM when I would load up certain games that hit the VRAM hard, the Guardians of the Galaxy game being a recent example of one that was spiking my VRAM temps.

Ended up using Gelid Extreme 1.5mm on the front VRAM, Gelid Extreme 2.0mm on the front VRM, and Gelid Ultimate 2.0mm on the back of the card. Dropped my max VRAM temps from 106C down to 96C, with them being closer to 90C in normal gaming loads. More importantly, my GPU fan is no longer ramping up past 1400 RPM or so to handle the VRAM temps. This has also had the effect of slightly increasing my GPU core temps, since the fan is no longer ramping up high to deal with VRAM temp issues; that's a common thing that seems to happen to people who have done the thermal pad mod and it sometimes causes some concern.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

ijyt posted:

I have a bit of the 1.5mm thermal pads left, tempted to do it for my 3080 - how much did you end up using? If I remember it's like $60 for a 50mm square.

e: whoops that was meant to quote the 3090 thermal pad swapping dude

I ended up using this thermal pad cutting template I found on the NV subreddit. Has measurements for all the pad cuts you need so you can work from that.

https://drive.google.com/file/d/18rPk56D8gdOPSzdKH4sC0SCKelHtBGnV/view


Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Cabbages and Kings posted:

Has anyone with a FE card done thermal pad replacement?

3090FE memory runs hotter than PNY 3080, by 5-6 degrees. The PNY didn't punch over 100C on 1440p gaming; can hit 102-106 easily here, at least in VR. The card sits right under a 240mm exhaust.

Seems hot but it doesn't throttle until 110 I guess, and I'd rather not invalidate the warranty.

Is there a way to get HWINFO to only log one value? Trying to play VR and then dig through the csv data is ugly even w/VSC and a csv plugin

I did the thermal pad swap last week. I was having about the same temps as you; I saw peaks of 106C on the VRAM while gaming, at which point the GPU fan would go to 100% speed (2600 RPM) to keep the VRAM cool enough to prevent thermal throttling. Since the thermal pad swap the highest I've seen is 98C in Guardians of the Galaxy, which seems to hit VRAM as hard as mining eth for some reason. Most of the time it's in the upper 80's to low 90's in gaming now and my fan speeds don't go above 1400 RPM.

If you are an American at least, swapping the thermal pads does not invalidate the warranty, as long as Nvidia can't prove you broke the card with the thermal pad swap specifically. If you're concerned about really breaking something, doing just the back of the card is very safe, as it doesn't require doing anything other than removing the backplate, and when I did just the back I saw enough of a drop in temps that my card's fans were not ramping up anymore. It was scary at first doing the whole card, but with the right tools it was quite easy. The most annoying parts of the 3090 FE disassembly were the magnetic screw covers on the back of the card, which required me to go buy some Gorilla tape to remove. I'd also recommend having some tweezers on hand to remove the LED cable if you do the full swap.
