Cygni
Nov 12, 2005

raring to post

change my name posted:

Business is so cool, you can just make up whatever numbers you feel like huh (unless some of that is already locked in for next year)

It actually proves your point, but fiscal quarters aren't aligned with the calendar year, and companies use all sorts of wild dates. Nvidia's Q3 FY24 already closed on October 29th, 2023.
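(Aside: if you want to sanity-check a date against Nvidia's fiscal calendar, here's a rough Python sketch. Nvidia's fiscal year actually ends in late January on a 52/53-week schedule, so the month arithmetic below is an illustration, not the real accounting calendar.)

    from datetime import date

    def nvidia_fiscal_quarter(d: date) -> str:
        # Nvidia's FY2024 spans roughly Feb 2023 through Jan 2024,
        # so shift the calendar: February begins fiscal Q1.
        fiscal_year = d.year + 1 if d.month >= 2 else d.year
        months_into_fy = (d.month - 2) % 12  # Feb -> 0, ..., Jan -> 11
        return f"Q{months_into_fy // 3 + 1} FY{fiscal_year}"

    print(nvidia_fiscal_quarter(date(2023, 10, 29)))  # -> Q3 FY2024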

Dr. Video Games 0031
Jul 17, 2004

change my name posted:

Business is so cool, you can just make up whatever numbers you feel like huh (unless some of that is already locked in for next year)

Those are numbers they're reporting to have already happened.

Internet Explorer
Jun 1, 2005





Cyrano4747 posted:

If you've already got a 3090 I really don't see a compelling reason to upgrade this generation at all, unless you're one of those edge cases where you actually need the grunt that a 4090 has.

In your shoes I'd be treading water and eyeballing whatever comes out for the 5080. If family needs a video card urgently and you're feeling generous, I'd probably just look at buying them something new. A 4070 would be a nice Christmas present.

I'm fortunate enough to have the money and I like being able to run high resolutions with all the bells and whistles on. But yeah, I'm thinking it might be pretty hard to justify. I guess we'll see!

Animal
Apr 8, 2003

PCWorld interviews Alex Battaglia about rendering and GPU stuff

https://www.youtube.com/watch?v=7TECZHoFBpE

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Lockback posted:

Will depend on a lot.

The 4070 ti Super seems like a good card if it's priced like the 4070 ti. That might be a winning spot for a release cycle that has so few of those.

The 4080 Super might be something if it's priced like the 4080, but the 4080 is priced so badly that I still think it's a bad play. If they somehow decide to do a price cut too, that would maybe be rad, but I doubt they will. So it'll probably be a slightly better 4080 for the same price, not very interesting.

The 4070 Super is going to sit between the 4070 and the 4070 Ti Super, as the 4070 Super IS NOT replacing the 4070. So who knows. There's room there for it to be priced really well, or kinda lovely.

So basically I have some optimism for the 4070 Ti Super and not much optimism for anything else.

Yeah, the 4070 Ti Super might be the only truly compelling part; if it stays at or near the 4070 Ti's price and also gets the rumored bump to 16 GB of VRAM, it all but eliminates any reason to go with a 4080, and maybe even a 4080 Super, given DLSS/etc. I'd expect it to perform really well at 4K/60 and probably even 4K/120 with DLSS.

Edit:

Cygni posted:

gaming almost back up to crypto bubble levels but holy poo poo that data center...



Won't somebody please think of poor Nvidia's revenue and profit? How can they afford to sell their cards so cheap? :qq:

MH Knights
Aug 4, 2007

Cygni posted:

gaming almost back up to crypto bubble levels but holy poo poo that data center...



Data Center includes AI stuff, correct?

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

MH Knights posted:

Data Center includes AI stuff, correct?

Yeah

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Internet Explorer posted:

I'm fortunate enough to have the money and I like being able to run high resolutions with all the bells and whistles on. But yeah, I'm thinking it might be pretty hard to justify. I guess we'll see!

I have started practicing my justification for the 5090. I think I’m going to be in good shape by the time it releases.

Internet Explorer
Jun 1, 2005





See, that's why I stay out of this thread. Y'all are a bad influence. The monitor thread as well.

MarcusSA
Sep 23, 2007

Internet Explorer posted:

See, that's why I stay out of this thread. Y'all are a bad influence. The monitor thread as well.

Nah, once you get an OLED monitor, that's the end game.

Internet Explorer
Jun 1, 2005





The reason I came to ask about a new graphics card is that I was already pushing the 3090, and then I just picked up the G9 OLED. To go along with my two LG C2s in the house. :2monocle:

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Internet Explorer posted:

The reason I came to ask about a new graphics card is that I was already pushing the 3090, and then I just picked up the G9 OLED. To go along with my two LG C2s in the house. :2monocle:

Realistically, unless you're worried about power consumption (which is totally valid), stick with your 3090 for another generation. The resale value is holding up because of the extra VRAM, so I'm sure that if you upgrade next gen you'll be able to offset the cost a bit.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Internet Explorer posted:

The reason I came to ask about a new graphics card is that I was already pushing the 3090, and then I just picked up the G9 OLED. To go along with my two LG C2s in the house. :2monocle:

3090 + DLSS until the 5000-series is out, then splurge on the 5090, echoing this:

change my name posted:

Realistically, unless you're worried about power consumption (which is totally valid), stick with your 3090 for another generation. The resale value is holding up because of the extra VRAM, so I'm sure that if you upgrade next gen you'll be able to offset the cost a bit.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Internet Explorer posted:

Maybe it's still too early to tell, but how are we feeling about the potential SUPER releases? Going to be something worth buying, or are people who buy it going to be kicking themselves for not waiting for the 5000 series? At least according to the rumor mill, they don't seem like they are going to be released all that far apart.

It's not just the Super releases themselves, but also the way the rest of the stack is shuffled around them. Some SKUs are going to get price cuts.

The 4070 Ti Super is going to be the interesting one, as others note. There's a lot of margin in AD103, and the 4080 is the most overpriced card by far; they could easily do an $849/$899 4080 and put some marginal 4080 Super right above it at some silly price. The 4070 Ti Super probably takes the 4070 Ti MSRP ($749), but you finally get a real 16GB card.

They will not put AD102 into a 4080 Super, not with the AI market this hot. If it's 20GB, it has to be a new die, in which case maybe $1100 for a 4080 Super 20GB? But probably not. Unless there's some curveball like 20GB, the 4080 Super is probably going to be a massive wet fart.

The 4060 Ti Super is also meh. AD104 is already way cut down for the 4070, and I just don't see them going way lower; they can't cut off like 40-50% of the die for these Supers to hit performance levels that "make sense". Maybe there's a refresh that moves 4060 Ti performance down to the $329/$379 price points or something, at most. That would push the 4060 down maybe $20 or something as well. Supposedly the 4060 Ti 16GB is selling OK for AI stuff too, which argues against big price cuts.

I don't know about having a 4070 Ti Super and also a 4070 Super; that rumor doesn't make sense, as there's not enough daylight there for a 4070, 4070 Super, 4070 Ti, and 4070 Ti Super, imo. Maybe they supersede the old SKUs and Nvidia just sells through the existing inventory to morons/prebuilts, but that's a lot of SKUs in a $200 space.

I agree: if you've got a 3090, just wait and buy a 5090, or wait and see what the market looks like then.

Paul MaudDib fucked around with this message at 07:57 on Nov 22, 2023

mobby_6kl
Aug 9, 2009

by Fluffdaddy
I'll just get whatever ends up being the most reasonable $/fps after the Supers drop (maybe even AMD), and hopefully that forces new pricing across the board. But we'll see; I've been saying I'll get a new GPU since Turing, but my 1070 keeps chugging along.

Arzachel
May 12, 2012

Paul MaudDib posted:

I don't know about having a 4070 Ti Super and also a 4070 Super; that rumor doesn't make sense, as there's not enough daylight there for a 4070, 4070 Super, 4070 Ti, and 4070 Ti Super, imo. Maybe they supersede the old SKUs and Nvidia just sells through the existing inventory to morons/prebuilts, but that's a lot of SKUs in a $200 space.

Rumour is they've stopped manufacturing the regular 4070 and 4070 Ti, so it would be just the Super models once stocks run out. That might take a while for the latter unless there are some really good deals, so it still sounds pretty cramped.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

MarcusSA posted:

Nah, once you get an OLED monitor, that's the end game.

Until you come back complaining of degradation.

We definitely don't have "perfect" display tech yet

mobby_6kl
Aug 9, 2009

by Fluffdaddy

HalloKitty posted:

Until you come back complaining of degradation.

We definitely don't have "perfect" display tech yet

We just need a monitor subscription where they'll automatically send you a new OLED monitor every 5,000 hours or whatever.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Arzachel posted:

Rumour is they've stopped manufacturing the regular 4070 and 4070 Ti, so it would be just the Super models once stocks run out. That might take a while for the latter unless there are some really good deals, so it still sounds pretty cramped.

Nope, the 4070 Ti is going to be stopped, but the 4070 is staying. So the new 4070 line is:

4070 --- 4070 Super --- 4070 Ti Super
https://www.dexerto.com/tech/nvidia-leak-says-rumored-4070-super-wont-fully-replace-base-model-2393954/

I would not expect the 4080 to be more than 16GB, but honestly I don't think that's a big deal. For its performance envelope, 16GB seems fine. It's not like you get more performance from 20GB vs 16GB, and I would question whether the 4080 has enough oomph to stay relevant long enough for 16GB to become a limiter.

FuzzySlippers
Feb 6, 2009

I'm already maxing out my 12gb regularly so I don't think 16 is a lot of headroom.

Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

FuzzySlippers posted:

I'm already maxing out my 12gb regularly so I don't think 16 is a lot of headroom.

The question is "doing what?"

1440 vs 4k vs VR vs AI shenanigans etc. is all going to radically change how much vram is enough for someone's use case over whatever kind of timeline you think a video card should be good for.

Saturnine Aberrance
Sep 6, 2010

Creator.

Please make me flesh.


I would run out of RAM regularly when I had 16GB, playing anything on the Frostbite engine like 5 or more years ago, if I didn't remember to kill all extraneous programs ahead of time. 32GB is a very reasonable target these days, imo.

SpaceDrake
Dec 22, 2006

I can't avoid filling a game with awful memes, even if I want to. It's in my bones...!
This is about VRAM, not normal RAM. (Which you are right about w/r/t 32GB. :v:)

If you want 32 GB of VRAM on something, you'd better be ready to wait a few more years and also be ready to pay $2000.

Branch Nvidian
Nov 29, 2012



Y'all know this is VRAM, not System RAM, right?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
The existence of the Xbox Series S is the only reason 8GB GPUs are going to stay usable at all in AAA games. Otherwise the floor for VRAM in the consoles would be the 10GB fast VRAM in the Series X, and the PS5 can have about 12GB of VRAM if you really squeeze hard.

We really should have 10-12GB in GPUs at the low end, but memory-controller and memory-chip size limitations put us in this awkward spot. Is anyone betting on the lowly $329-MSRP 3060 aging weirdly well because they all came with 12GB of VRAM? Hell, what's the story with those 20GB 3080s? I never saw benchmarks of them: https://www.tomshardware.com/news/geforce-rtx-3080-20gb-gpus-emerge-for-around-dollar575.

I wonder what those 20GB 3080s sold for new. VRAM is cheap.

FuzzySlippers
Feb 6, 2009

Just running recent games on max details at ultrawide 1440p had me maxing out 12GB of VRAM, and I just moved up to 1600p UW, which I assume is gonna be worse. Games aimed at only the current console gen are hungrier for both VRAM and system RAM, especially when running balls-out with additional PC gaming fuckery on top, whether that's higher detail/resolution or just having to share memory with random other open poo poo. I've noticed browser tabs getting greedier with VRAM, though I don't know if that's a bug or on purpose.

Crazy game modding can also absolutely blow up your VRAM, since modders use it less efficiently. You can make Skyrim look pretty impressive these days, but doing it requires laughable specs.

Any kind of AI tinkering will blow past 16GB easily, but that's an entirely different thing, and who knows if locally run models will matter over the next decade for people not engaged in actual AI work. I was impressed with what I could do for funzies, but the responsiveness gain from running locally is pretty minimal compared to losing all the easily accessible power available remotely. Maybe someone will come up with an interesting use for local.

Saturnine Aberrance
Sep 6, 2010

Creator.

Please make me flesh.


SpaceDrake posted:

This is about VRAM, not normal RAM. (Which you are right about w/r/t 32GB. :v:)

If you want 32 GB of VRAM on something, you'd better be ready to wait a few more years and also be ready to pay $2000.

omg lmao. I saw the 16 and instantly my brain swapped over to system RAM, not VRAM, whoops.

Yeah totally disregard my anecdote.

Cyrano4747
Sep 25, 2006

Yes, I know I'm old, get off my fucking lawn so I can yell at these clouds.

FuzzySlippers posted:

Just running recent games on max details at ultrawide 1440p had me maxing out 12GB of VRAM, and I just moved up to 1600p UW, which I assume is gonna be worse.

Sure, but UW isn't exactly apples to apples with normal resolutions. It's significantly more pixels to push.

I agree that 8GB is barely adequate today and manifestly inadequate for the future, but I don't think we're quite at 12GB being a hard floor either.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Resolution mostly affects VRAM usage because it involves higher-resolution textures, I thought, and not because the target framebuffers are larger. If that's the case, then I wouldn't expect 1600 UW to use much more VRAM than 1440 UW, because they'll use the same textures and shadowmap resolutions and such?

repiv
Aug 13, 2009

there's no straightforward answer to that because it depends on the streaming/eviction strategy the engine uses, and the assets (how often are textures repeated)

Indiana_Krom
Jun 18, 2007
Net Slacker
But it is safe to say the framebuffer itself, in isolation, isn't a significant driver of VRAM consumption anymore.
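(A back-of-the-envelope check on that, as a sketch: the 4 bytes/pixel and the count of eight full-resolution render targets below are assumptions, and real pipelines vary and also carry depth, HDR, and intermediate buffers.)

    def render_targets_mib(width: int, height: int,
                           bytes_per_pixel: int = 4,
                           num_targets: int = 8) -> float:
        # Total size of the full-resolution render targets, in MiB.
        return width * height * bytes_per_pixel * num_targets / 1024**2

    print(render_targets_mib(3440, 1440))  # 1440p ultrawide: ~151 MiB
    print(render_targets_mib(3840, 1600))  # 1600p ultrawide: ~188 MiB

Going from 3440x1440 to 3840x1600 adds a few dozen MiB of render targets, which is noise next to a 12-16GB card; the growth in VRAM use comes from textures and other assets.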

repiv
Aug 13, 2009

i don't think framebuffer size was ever that big of a concern really, save for on the xbox 360 where it had to fit into the 10mb of fast memory

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

repiv posted:

i don't think framebuffer size was ever that big of a concern really, save for on the xbox 360 where it had to fit into the 10mb of fast memory

The Xbox One was stuck at 900p in most games because its 32MB of fast memory couldn't fit all of the frame-sized buffers that a modern graphics pipeline needs. Even MGS5 is stuck at 900p for all eternity on today's vastly more powerful Xbox.
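(The arithmetic behind that 900p ceiling, as a sketch: it assumes 4 bytes per pixel per render target, while real G-buffer layouts and compression tricks varied per game.)

    def target_mib(w: int, h: int, bytes_per_pixel: int = 4) -> float:
        # One full-resolution render target, in MiB.
        return w * h * bytes_per_pixel / 1024**2

    for w, h in [(1600, 900), (1920, 1080)]:
        per = target_mib(w, h)
        print(f"{w}x{h}: {per:.1f} MiB per target, "
              f"{int(32 // per)} fit in 32 MiB of ESRAM")

At 900p you fit five full-resolution targets in the 32 MiB versus four at 1080p, which for a deferred pipeline was roughly the difference between staying in fast memory and spilling into slow DDR3.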

repiv
Aug 13, 2009

oh yeah that too, microsoft couldn't help but make the same mistake twice

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
https://www.tomshardware.com/news/d...sanctioned-tech

AMD won't be able to get a leg up in China either: Joe Biden banned the 7900 XTX.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
7900 XT too, but not the 4080? Or we just haven't heard yet.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
They're measuring the raw computational power, not framerates in Cyberpunk. The 7900XT is a more powerful GPU than the 4080 and over the limit, but the 4080 sneaks in under the limit.

SpaceDrake
Dec 22, 2006

I can't avoid filling a game with awful memes, even if I want to. It's in my bones...!
Which is to say, that rule is going to get absolutely hilarious in 2025 and beyond unless it gets the poo poo lobbied out of it. Nvidia is not going to appreciate having the 5080 (and maybe even the 5070Ti?) sanctioned off, and AMD is going to be mad about losing upwards of half its catalogue for export.

BurritoJustice
Oct 9, 2012

Twerk from Home posted:

They're measuring the raw computational power, not framerates in Cyberpunk. The 7900XT is a more powerful GPU than the 4080 and over the limit, but the 4080 sneaks in under the limit.

The 7900XTX isn't close to the TPP limit for the export restrictions, and the 4080 is actually higher TPP (though still also well under the TPP limit).

The formula is bits * TFLOPS, and the limit is a max of 4800 TPP in any precision the GPU supports. The 4090 exceeds it with 330 TFLOPS at FP16, for 5280 TPP. The 4080 has 195 TFLOPS at FP16, for 3120 TPP. As noted in the article, the 7900XTX only has a TPP of 1962 at FP16.

NVIDIA's tensor cores count for the bans, which is why their TPP is so much higher across the board. If they are banning the AMD GPUs, it's a special case, unrelated to the standard TPP bans.
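(The arithmetic, as the post describes it, in a short sketch; the TFLOPS figures are the ones quoted above, and the 7900 XTX value is back-solved from its quoted ~1962 TPP.)

    TPP_LIMIT = 4800  # cap on TPP in any precision the GPU supports

    gpus = {  # dense FP16 TFLOPS, per the post
        "RTX 4090": 330.0,
        "RTX 4080": 195.0,
        "7900 XTX": 122.6,  # ~1962 TPP / 16 bits
    }

    for name, tflops in gpus.items():
        tpp = 16 * tflops  # TPP = bit width * TFLOPS at that width
        verdict = "over" if tpp > TPP_LIMIT else "under"
        print(f"{name}: TPP {tpp:.0f} ({verdict} the limit)")

Only the 4090 clears 4800; the 4080 and 7900 XTX both land well under, which matches why only the 4090 was caught by the standard TPP rule.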

Branch Nvidian
Nov 29, 2012



SpaceDrake posted:

AMD is going to be mad about losing upwards of half its catalogue for export.

Not if they just don't make the upper half of the catalogue anymore! :pseudo:
